FastLaunchAPI

Getting Started

Quick start guide for FastLaunchAPI with authentication, payments, and modern tooling

Overview

This FastAPI template eliminates months of boilerplate setup by providing a production-ready backend architecture. Built with modern Python practices and async SQLAlchemy 2.0, it includes everything you need to launch a robust web application: secure authentication with OAuth, payment processing, background task management, and a clean, maintainable codebase.

Perfect for startups, MVPs, and enterprise applications that need to move fast without sacrificing code quality or security.

What's Included

Architecture Overview

The template follows a clean, modular architecture designed for scalability and maintainability:

__init__.py
main.py
celery_setup.py
conftest.py
utility.py
pyproject.toml
uv.lock
.python-version
.env.sample
alembic.ini
Procfile
runtime.txt
docker-compose.yaml
Dockerfile
README.md

Key Architecture Principles:

  • Async-First: Built with async SQLAlchemy 2.0 and asyncpg for high-performance database operations
  • Dual Session Support: Async sessions for FastAPI endpoints, sync sessions for Celery tasks
  • Modular Design: Each feature (auth, payments, email) is self-contained with its own routes, models, services, and tasks
  • Dynamic OAuth: Automatic provider detection and credential injection from environment variables
  • Background Processing: Celery with Redis for heavy tasks, proper session management in tasks
  • Modern SQLAlchemy: Using 2.0 syntax with select(), execute(), and proper async patterns
  • Modern Package Management: uv for lightning-fast dependency installation and management
  • Configuration Management: Centralized settings with Pydantic validation and environment-based config
  • Testing Ready: Comprehensive test setup with fixtures and database isolation
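
The centralized-settings principle can be sketched with stdlib dataclasses. The template itself uses Pydantic for this; `Settings` and `load_settings` below are illustrative names, not the template's actual API:

```python
import os
from dataclasses import dataclass

# Stdlib sketch of centralized, validated settings. The template uses
# Pydantic; the class and function names here are illustrative only.
@dataclass(frozen=True)
class Settings:
    secret_key: str
    backend_url: str = "http://localhost:8000"

    def __post_init__(self) -> None:
        # Fail fast at startup instead of at first use.
        if len(self.secret_key) < 32:
            raise ValueError("SECRET_KEY must be at least 32 characters")

def load_settings() -> Settings:
    return Settings(
        secret_key=os.environ["SECRET_KEY"],
        backend_url=os.environ.get("BACKEND_URL", "http://localhost:8000"),
    )
```

Validating once at load time mirrors how Pydantic settings surface misconfiguration before the first request is served.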

Quick Setup

Prerequisites

Before you begin, ensure you have the following installed:

  • Python 3.12+ - The template requires Python 3.12 or higher
  • uv - Modern Python package manager (10-100x faster than pip)
  • Docker & Docker Compose - For running PostgreSQL, Redis, Celery, and monitoring tools
  • Git - Version control

Install uv:

  • Unix/macOS: curl -LsSf https://astral.sh/uv/install.sh | sh
  • Windows: powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
  • Or via pip: pip install uv

Using Docker Compose is recommended - it automatically sets up PostgreSQL, Redis, Celery workers, Celery Beat, pgAdmin, and Flower.

Clone the Repository

Clone the template repository to your local machine:

git clone https://github.com/yourusername/fastlaunchapi-premium.git
cd fastlaunchapi-premium/backend

Configure Environment

Copy the sample environment file:

cp .env.sample .env

Update the .env file with your configuration:

.env
# Security - Generate a secure random string (min 32 characters)
SECRET_KEY=your-secret-key-min-32-chars-long-random-string-here

# Database - Must use asyncpg driver for async SQLAlchemy 2.0
DATABASE_URL=postgresql+asyncpg://postgres:postgres@postgres:5432/fastlaunchapi

# Backend & Frontend URLs
BACKEND_URL=http://localhost:8000
FRONTEND_URL=http://localhost:3000

# OAuth Providers (Google - add more providers as needed)
# Note: No GOOGLE_REDIRECT_URI needed - automatically generated from BACKEND_URL
GOOGLE_CLIENT_ID=<YOUR_GOOGLE_CLIENT_ID>
GOOGLE_CLIENT_SECRET=<YOUR_GOOGLE_CLIENT_SECRET>

# Stripe Payment
STRIPE_PUBLIC_KEY=<YOUR_STRIPE_PUBLIC_KEY>
STRIPE_SECRET_KEY=<YOUR_STRIPE_SECRET_KEY>
WEBHOOK_SECRET=<YOUR_STRIPE_WEBHOOK_SECRET>

# AI APIs
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
GROQ_API_KEY=<YOUR_GROQ_API_KEY>

# Redis (for Celery)
REDIS_DSN=redis://:yourpassword@redis:6379/0

# Email (SendGrid)
SENDGRID_API_KEY=<YOUR_SENDGRID_API_KEY>
SUPPORT_EMAIL=[email protected]
FROM_EMAIL=[email protected]
COMPANY_NAME="Your Company Name"

Critical Configuration:

  • SECRET_KEY: Generate a secure 32+ character random string
  • DATABASE_URL: Must use postgresql+asyncpg:// for async support
  • BACKEND_URL: Used to auto-generate OAuth redirect URIs (e.g., {BACKEND_URL}/auth/callback/google)
  • REDIS_DSN: Required for Celery background tasks
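
One way to generate a suitable SECRET_KEY is Python's secrets module:

```python
import secrets

# token_urlsafe(48) yields a ~64-character URL-safe random string,
# comfortably above the 32-character minimum.
print(secrets.token_urlsafe(48))
```

Paste the printed value into .env as SECRET_KEY.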

Start Docker Services

Start all backend services with Docker Compose:

# From the project root directory
cd ..
docker-compose up -d

This starts:

  • PostgreSQL - Database server (port 5432)
  • Redis - Message broker for Celery (port 6379)
  • Celery Worker - Processes background tasks
  • Celery Beat - Scheduler for recurring tasks
  • pgAdmin - Database management UI (http://localhost:5050)
  • Flower - Celery monitoring UI (http://localhost:5555)

All backend services are now running in Docker containers!

Install Python Dependencies

Install dependencies using uv (includes creating virtual environment automatically):

cd backend
uv sync --all-extras

This command:

  • Creates a virtual environment automatically (.venv)
  • Installs all production dependencies from pyproject.toml
  • Installs dev dependencies (black, ruff, mypy, pytest)
  • Generates uv.lock for reproducible builds
  • Completes in seconds (10-100x faster than pip)

uv handles everything! No need to manually create or activate virtual environments. Just run uv sync and you're ready.

Manual activation (optional): If you need to activate the venv manually:

source .venv/bin/activate  # Unix/macOS
.venv\Scripts\activate     # Windows

Setup Database

Run database migrations to create tables:

# Create initial migration (if needed)
uv run alembic revision --autogenerate -m "Initial migration"

# Apply migrations
uv run alembic upgrade head

The database is automatically created by Docker. Migrations create the table structure.

Start the Application

Start the FastAPI development server with uv:

uv run fastapi dev main.py

Or using uvicorn directly:

uv run uvicorn main:app --reload --host 0.0.0.0 --port 8000

uv run automatically uses the project's virtual environment - no activation needed!

Verify Installation

Your FastAPI application should now be running! Visit these URLs to verify:

  • API docs (Swagger UI): http://localhost:8000/docs
  • Alternative docs (ReDoc): http://localhost:8000/redoc
  • pgAdmin: http://localhost:5050
  • Flower: http://localhost:5555

Success! Your FastLaunchAPI is ready for development.

Core Components

Authentication System

Complete JWT-based authentication with OAuth2 support, dynamic provider system, and enterprise-grade security features.

Core Features:

  • JWT access tokens with refresh token rotation
  • Secure password hashing using bcrypt
  • Email verification with secure token generation
  • Password reset flow with time-limited tokens
  • Dynamic OAuth2 providers - Easy to add new providers (Google, GitHub, etc.)
  • Automatic OAuth redirect URI generation from BACKEND_URL
  • Session-based state management for OAuth flows
  • FastAPI security with OAuth2PasswordBearer

How OAuth Works:

  1. OAuth providers auto-register from environment variables ({PROVIDER}_CLIENT_ID, {PROVIDER}_CLIENT_SECRET)
  2. Redirect URIs automatically generated: {BACKEND_URL}/auth/callback/{provider}
  3. No hardcoded redirect URIs in environment - fully dynamic
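
The env-driven discovery in steps 1-2 can be sketched as follows (`discover_oauth_providers` is an illustrative helper; the template's actual registration lives in its OAuth setup module):

```python
# Sketch of dynamic provider discovery: scan the environment for
# {PROVIDER}_CLIENT_ID / {PROVIDER}_CLIENT_SECRET pairs and derive
# each redirect URI from BACKEND_URL.
def discover_oauth_providers(env: dict[str, str]) -> dict[str, dict[str, str]]:
    backend_url = env.get("BACKEND_URL", "http://localhost:8000").rstrip("/")
    providers: dict[str, dict[str, str]] = {}
    for key, value in env.items():
        if key.endswith("_CLIENT_ID"):
            name = key[: -len("_CLIENT_ID")].lower()
            secret = env.get(f"{name.upper()}_CLIENT_SECRET")
            if secret:  # only register fully configured providers
                providers[name] = {
                    "client_id": value,
                    "client_secret": secret,
                    "redirect_uri": f"{backend_url}/auth/callback/{name}",
                }
    return providers
```

Passing `os.environ` to such a helper yields one entry per configured provider, each with a redirect URI built from BACKEND_URL rather than hardcoded.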

Payment Processing

Full-featured Stripe integration with webhook handling, subscription management, and secure payment processing.

Supported Features:

  • One-time payments and payment intents
  • Subscription management (create, update, cancel)
  • Webhook endpoint handling with signature verification
  • Customer management and billing profiles
  • Payment method storage and management
  • Invoice generation and management
  • Proration and billing cycle handling
  • Refund processing
  • Background processing with Celery for webhook events
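
Webhook signature verification, which the template delegates to Stripe's SDK (typically `stripe.Webhook.construct_event`), boils down to an HMAC check over the timestamped payload. A stdlib sketch of that check (the function name is illustrative):

```python
import hashlib
import hmac
import time

def verify_stripe_signature(payload: bytes, sig_header: str, secret: str,
                            tolerance: int = 300) -> bool:
    # Stripe-Signature header looks like: "t=<timestamp>,v1=<hex hmac>"
    parts = dict(p.split("=", 1) for p in sig_header.split(","))
    timestamp = int(parts["t"])
    if abs(time.time() - timestamp) > tolerance:
        return False  # reject stale events (replay protection)
    # The signed payload is "<timestamp>.<raw request body>".
    signed_payload = f"{timestamp}.".encode() + payload
    expected = hmac.new(secret.encode(), signed_payload,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, parts.get("v1", ""))
```

In practice you should keep using the SDK helper, which also handles multiple signature versions; the sketch shows why the raw request body (not re-serialized JSON) must reach the verifier.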

Email System

Comprehensive email infrastructure with SendGrid, Jinja2 templates, and async background processing.

Email Features:

  • HTML email templates with Jinja2 templating engine
  • SendGrid API integration for reliable delivery
  • Async email sending via Celery - Non-blocking email operations
  • Email verification and welcome sequences
  • Password reset and security notifications
  • Template management with HTML files
  • Background task processing for high-volume sending

Database Layer

Production-ready async database setup with SQLAlchemy 2.0, proper session management, and dual sync/async support.

Database Features:

  • Async SQLAlchemy 2.0 with asyncpg driver for high performance
  • Dual session support:
    • AsyncSessionLocal for FastAPI async endpoints
    • SyncSessionLocal for Celery synchronous tasks
  • Modern SQLAlchemy 2.0 syntax: select(), execute(), update(), delete()
  • Alembic migrations for schema version control
  • Connection pooling and optimization
  • Proper session lifecycle management
  • Transaction handling with commit/rollback
  • Database dependency injection: db_dependency = Annotated[AsyncSession, Depends(get_db)]

Important: Always use SyncSessionLocal in Celery tasks, never the async session!

Background Task Management

Celery integration with Redis broker for scalable background processing, proper database session handling, and comprehensive error handling.

Task Management:

  • Celery worker with Redis broker and result backend
  • Proper session management: Tasks use SyncSessionLocal for database operations
  • Task queue management and prioritization
  • Error handling with automatic retry mechanisms
  • Task monitoring and logging
  • Result backend for task status tracking
  • Task routing and worker specialization
  • Memory management with worker_max_tasks_per_child
  • Background email sending, payment processing, and data exports

Task Pattern:

from sqlalchemy import select

from app.db.database import SyncSessionLocal
from celery_setup import celery_app

@celery_app.task()
def my_task(user_id: int):
    db = SyncSessionLocal()
    try:
        # Use SQLAlchemy 2.0 syntax
        stmt = select(User).where(User.id == user_id)
        result = db.execute(stmt)
        user = result.scalar_one()
        # Process...
        db.commit()
    finally:
        db.close()  # Always close!

Task Scheduling System

Celery Beat scheduler for automated tasks, maintenance jobs, and recurring operations.

Scheduling Features:

  • Cron-style task scheduling with crontab()
  • Recurring task management (hourly, daily, weekly, monthly)
  • Dynamic schedule updates
  • UTC timezone configuration
  • Task queue routing for scheduled tasks
  • Maintenance window scheduling
  • Background cleanup tasks (e.g., expired tokens)

Example Schedule:

celery_app.conf.beat_schedule = {
    'sample_task': {
        'task': 'app.routers.core.tasks.sample_task',
        'schedule': crontab(minute="0", hour='*/4'),  # Every 4 hours
        'options': {'queue': 'default'}
    },
}

Security Framework

Multi-layered security approach with CORS, input validation, OAuth2 compliance, and protection against common attacks.

Security Features:

  • CORS configuration for cross-origin requests
  • Input validation with Pydantic models
  • SQL injection protection via SQLAlchemy ORM
  • OAuth2 security with state validation
  • Session middleware for OAuth state management
  • Secure password hashing with bcrypt
  • JWT token validation and expiration
  • Environment-based secrets management
  • Webhook signature verification (Stripe)

Development & Production Tools

Complete development environment with Docker Compose, monitoring tools, and production-ready configurations.

Developer Tools:

  • Docker Compose with all services pre-configured
  • pgAdmin for database management
  • Flower for Celery task monitoring
  • Hot reload with uv run fastapi dev and --reload
  • Alembic for database migrations
  • Environment variable management with .env
  • Logging configuration
  • Health check endpoints
  • Code quality tools: black, ruff, mypy
  • Testing framework: pytest with coverage

Package Management with uv

Why uv?

  • ⚡ 10-100x faster than pip
  • 🔒 Reproducible builds with uv.lock
  • 🎯 Automatic virtual environment management
  • 🚀 Written in Rust for maximum performance
  • 📦 Compatible with pip and PyPI
  • 🛠️ Modern dependency resolution

Common uv Commands:

# Add a new package
uv add fastapi

# Add a dev dependency
uv add --dev black

# Update all packages
uv sync --upgrade

# Remove a package
uv remove package-name

# Run any command in the venv
uv run python script.py
uv run black .
uv run ruff check .

# Show installed packages
uv pip list

# Pin Python version
uv python pin 3.12

# Install a specific Python version
uv python install 3.12

Development Workflow

Working with the Database

Critical: Use async patterns in FastAPI endpoints and sync patterns in Celery tasks. Never mix them!

FastAPI Endpoints (Async):

from fastapi import APIRouter, Depends
from app.db.database import db_dependency
from sqlalchemy import select

router = APIRouter()

@router.get("/users/{user_id}")
async def get_user(user_id: int, db: db_dependency):
    stmt = select(User).where(User.id == user_id)
    result = await db.execute(stmt)
    user = result.scalar_one_or_none()
    return user

Celery Tasks (Sync):

from app.db.database import SyncSessionLocal
from celery_setup import celery_app
from sqlalchemy import select

@celery_app.task()
def process_user(user_id: int):
    db = SyncSessionLocal()
    try:
        stmt = select(User).where(User.id == user_id)
        result = db.execute(stmt)  # No await!
        user = result.scalar_one()
        # Process...
        db.commit()
    finally:
        db.close()

Adding OAuth Providers

Adding a new OAuth provider is easy:

  1. Add credentials to .env:

GITHUB_CLIENT_ID=your_github_client_id
GITHUB_CLIENT_SECRET=your_github_client_secret

  2. Register the provider in oauth_providers.py:

oauth.register(
    name='github',
    # Credentials automatically injected from settings
    server_metadata_url='https://github.com/.well-known/oauth-authorization-server',
    client_kwargs={'scope': 'user:email'}
)

  3. The redirect URI is auto-generated: {BACKEND_URL}/auth/callback/github

Background Task Management

Start Celery workers and beat scheduler with uv:

# Start Celery worker (processes tasks)
uv run celery -A celery_setup worker --loglevel=info

# Start Celery Beat (schedules recurring tasks)
uv run celery -A celery_setup beat --loglevel=info

# Monitor with Flower (optional, also available in Docker)
uv run celery -A celery_setup flower

If using Docker Compose, workers and beat are already running! Just restart them: docker-compose restart celery_worker celery_beat

Common Use Cases:

  • Email sending (verification, password reset, notifications)
  • Payment webhook processing
  • Data exports and report generation
  • Database cleanup and maintenance tasks
  • API integrations and third-party syncing
  • Image processing and file uploads

Database Migrations

When you modify models:

# Generate migration from model changes
uv run alembic revision --autogenerate -m "Description of changes"

# Review generated migration in alembic/versions/

# Apply migration
uv run alembic upgrade head

# Rollback last migration
uv run alembic downgrade -1

# View migration history
uv run alembic history

Testing

Run the test suite with uv:

# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=app

# Run specific test file
uv run pytest app/routers/auth/tests/test_auth_routes.py

# Run with verbose output
uv run pytest -v

Code Quality Checks

Run code quality tools:

# Format code with black
uv run black .

# Check formatting
uv run black . --check

# Lint with ruff
uv run ruff check .

# Auto-fix ruff issues
uv run ruff check --fix .

# Type checking with mypy
uv run mypy app/

Docker Services Overview

Services Included:

  • PostgreSQL (port 5432): Main database
  • Redis (port 6379): Celery broker and result backend
  • Celery Worker: Background task processor
  • Celery Beat: Recurring task scheduler
  • pgAdmin (port 5050): Database UI ([email protected] / admin)
  • Flower (port 5555): Celery monitoring UI

Useful Commands:

# View all running services
docker-compose ps

# View logs
docker-compose logs -f

# Restart specific service
docker-compose restart celery_worker

# Stop all services
docker-compose down

# Rebuild and start
docker-compose up -d --build

Common Issues & Solutions

Database Connection Failed:

  • Ensure Docker is running: docker ps
  • Check that DATABASE_URL uses postgresql+asyncpg://
  • Verify the host is postgres (the Docker service name), not localhost

Celery Tasks Not Executing:

  • Check that the worker is running: docker-compose ps celery_worker
  • View logs: docker-compose logs celery_worker
  • Verify tasks are imported in the celery_setup.py include list
  • Ensure the .env file exists and REDIS_DSN is correct

OAuth State Mismatch:

  • Ensure SessionMiddleware is configured in main.py
  • Check that BACKEND_URL matches your actual backend URL
  • Clear browser cookies and try again
  • Verify the OAuth provider redirect URI matches {BACKEND_URL}/auth/callback/{provider}

Migration Conflicts:

  • Check history: uv run alembic history
  • Merge heads: uv run alembic merge heads
  • Then upgrade: uv run alembic upgrade head

uv not found:

  • Install uv (see the Prerequisites section)
  • Ensure uv is on your PATH: uv --version
  • Restart your terminal after installation

Key Takeaways

Modern Architecture:

  • ✅ Async SQLAlchemy 2.0 with asyncpg
  • ✅ Dual session support (async for API, sync for Celery)
  • ✅ Dynamic OAuth with auto-generated redirect URIs
  • ✅ Proper error handling and retry logic
  • ✅ Docker Compose for easy local development
  • ✅ Production-ready security and monitoring
  • ✅ Modern dependency management with uv
  • ✅ Lightning-fast package installation (10-100x faster than pip)
  • ✅ Reproducible builds with lock files
  • ✅ Automatic virtual environment management