Quick Start
Get FastLaunchAPI running in minutes with this quick setup guide
Prerequisites
Ensure you have the following installed:
- Python 3.12+
- uv - Modern Python package manager (10-100x faster than pip)
- Docker & Docker Compose (Recommended - includes PostgreSQL, Redis, Celery workers)
- Git
Install uv:

- Unix/macOS: `curl -LsSf https://astral.sh/uv/install.sh | sh`
- Windows: `powershell -c "irm https://astral.sh/uv/install.ps1 | iex"`
- Or via pip: `pip install uv`
This guide uses Docker for the easiest setup. All services (PostgreSQL, Redis, Celery, pgAdmin, Flower) run in containers.
Clone the Repository
```bash
git clone https://github.com/yourusername/fastlaunchapi-premium.git
cd fastlaunchapi-premium/backend
```

Configure Environment
Copy the sample environment file:

```bash
cp .env.sample .env
```

Update the `.env` file with your configuration:

```bash
# Security - Generate a secure random string (min 32 characters)
SECRET_KEY=your-secret-key-min-32-chars-long-random-string-here

# Database - Use asyncpg driver for async SQLAlchemy
DATABASE_URL=postgresql+asyncpg://postgres:postgres@postgres:5432/fastlaunchapi

# Backend & Frontend URLs
BACKEND_URL=http://localhost:8000
FRONTEND_URL=http://localhost:3000

# OAuth Providers (Google - add more providers as needed)
# No need for GOOGLE_REDIRECT_URI - automatically generated from BACKEND_URL
GOOGLE_CLIENT_ID=<YOUR_GOOGLE_CLIENT_ID>
GOOGLE_CLIENT_SECRET=<YOUR_GOOGLE_CLIENT_SECRET>

# Stripe Payment
STRIPE_PUBLIC_KEY=<YOUR_STRIPE_PUBLIC_KEY>
STRIPE_SECRET_KEY=<YOUR_STRIPE_SECRET_KEY>
WEBHOOK_SECRET=<YOUR_STRIPE_WEBHOOK_SECRET>

# AI APIs
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
GROQ_API_KEY=<YOUR_GROQ_API_KEY>

# Redis (for Celery background tasks)
REDIS_DSN=redis://:yourpassword@redis:6379/0

# Email (SendGrid)
SENDGRID_API_KEY=<YOUR_SENDGRID_API_KEY>
SUPPORT_EMAIL=[email protected]
FROM_EMAIL=[email protected]
COMPANY_NAME="Your Company Name"
```

Important Configuration Notes:

- `SECRET_KEY`: Generate a secure random string (min 32 characters)
- `DATABASE_URL`: Must use `postgresql+asyncpg://` for async support
- `BACKEND_URL`: Used to auto-generate OAuth redirect URIs
- `REDIS_DSN`: Update the password if needed for production
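One quick way to generate a suitable `SECRET_KEY` is with Python's standard `secrets` module (shown here as a convenience; any cryptographically secure string of 32+ characters works):

```python
# Generate a cryptographically secure value for the SECRET_KEY setting.
import secrets

# 32 random bytes encode to roughly 43 URL-safe characters,
# comfortably above the 32-character minimum.
secret_key = secrets.token_urlsafe(32)
print(f"SECRET_KEY={secret_key}")
```

Copy the printed line into your `.env` file.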
Start Docker Services
Start all services with Docker Compose:
```bash
# From the project root directory
cd ..
docker-compose up -d
```

This starts:
- PostgreSQL - Database server (port 5432)
- Redis - Message broker for Celery (port 6379)
- Celery Worker - Background task processor
- Celery Beat - Task scheduler
- pgAdmin - Database management UI (http://localhost:5050)
- Flower - Celery monitoring UI (http://localhost:5555)
All backend services are now running in Docker containers!
Install Python Dependencies
Install dependencies using uv (fast, with automatic virtual-environment management):

```bash
cd backend
uv sync --all-extras
```

This command:

- ⚡ Creates a virtual environment automatically (`.venv`)
- 📦 Installs all dependencies from `pyproject.toml`
- 🛠️ Includes dev tools (black, ruff, mypy, pytest)
- 🔒 Generates `uv.lock` for reproducible builds
- 🚀 Completes in seconds (10-100x faster than pip)

uv handles everything: there is no need to manually create or activate virtual environments. Just run `uv sync` and you're ready.
Setup Database
Run database migrations to create tables:
```bash
# Create initial migration (if needed)
uv run alembic revision --autogenerate -m "Initial migration"

# Apply migrations
uv run alembic upgrade head
```

The database itself is created automatically by Docker; these commands set up the table structure.
Start the FastAPI Application
Start the development server with uv:
```bash
uv run fastapi dev main.py
```

Or use uvicorn directly:

```bash
uv run uvicorn main:app --reload --host 0.0.0.0 --port 8000
```

`uv run` automatically uses the project's virtual environment - no activation needed!
Verify Installation
Your API is now running! Check these URLs:
- API Documentation: http://localhost:8000/docs
- Alternative Docs: http://localhost:8000/redoc
- pgAdmin (Database): http://localhost:5050
- Flower (Celery Tasks): http://localhost:5555
Success! Your FastLaunchAPI is ready for development.
Docker Services Overview
PostgreSQL Database
- Port: 5432
- User: postgres
- Password: postgres (change in production!)
- Database: fastlaunchapi
Redis
- Port: 6379
- Used for: Celery task queue and results backend
- Password: Set in REDIS_DSN
Celery Worker
- Processes background tasks asynchronously
- Automatically loads tasks from `app.routers.*.tasks`
Celery Beat
- Schedules recurring tasks (cron-like)
- Configured in `celery_setup.py`
pgAdmin (Database Management)
- URL: http://localhost:5050
- Email: [email protected]
- Password: admin
- Connect to PostgreSQL using host `postgres`
Flower (Celery Monitoring)
- URL: http://localhost:5555
- Monitor task execution in real-time
- View worker status and queue metrics
Development Workflow
Making Database Changes
When you modify models:
```bash
# Generate migration
uv run alembic revision --autogenerate -m "Description of changes"

# Review the generated migration in alembic/versions/

# Apply migration
uv run alembic upgrade head
```

Adding Background Tasks
- Create the task in `app/routers/*/tasks.py`
- Import it in the `celery_setup.py` include list
- Restart the Celery worker: `docker-compose restart celery_worker`
Testing OAuth
- Set up Google OAuth credentials in Google Cloud Console
- Add the redirect URI: `http://localhost:8000/auth/callback/google`
- Update `GOOGLE_CLIENT_ID` and `GOOGLE_CLIENT_SECRET` in `.env`
- Test at http://localhost:8000/docs → try the `/auth/google` endpoint
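As noted in the environment configuration, the Google redirect URI is derived from `BACKEND_URL` rather than set explicitly. A rough sketch of that derivation (the helper name is hypothetical; the exact function in the codebase may differ):

```python
# Sketch: derive an OAuth callback URL from BACKEND_URL.
# The "/auth/callback/{provider}" path matches the URI you
# register in the Google Cloud Console.
def oauth_redirect_uri(backend_url: str, provider: str) -> str:
    return f"{backend_url.rstrip('/')}/auth/callback/{provider}"


print(oauth_redirect_uri("http://localhost:8000", "google"))
# → http://localhost:8000/auth/callback/google
```

If you change `BACKEND_URL` (e.g. for production), remember to register the new callback URL in the provider console as well.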
Running Tests
```bash
# Run all tests
uv run pytest

# Run with coverage
uv run pytest --cov=app

# Run specific test file
uv run pytest app/routers/auth/tests/test_auth_routes.py
```

Code Quality
```bash
# Format code
uv run black .

# Lint code
uv run ruff check .

# Auto-fix linting issues
uv run ruff check --fix .

# Type checking
uv run mypy app/
```

Package Management with uv
Why uv? ⚡ 10-100x faster | 🔒 Reproducible builds | 🎯 Auto venv management | 🚀 Rust-powered | 📦 PyPI compatible
Common Commands:
```bash
# Add a new package
uv add package-name

# Add a dev dependency
uv add --dev package-name

# Update all packages
uv sync --upgrade

# Remove a package
uv remove package-name

# Run any command in the venv
uv run python script.py

# Show installed packages
uv pip list
```

Next Steps
Now that you're up and running:
- Full Getting Started Guide - Detailed setup and architecture
- Authentication - Set up OAuth and user management
- Background Tasks - Learn Celery task patterns
- Database - Work with async SQLAlchemy 2.0
- Payment Integration - Configure Stripe
- AI Integration - Add AI capabilities
Quick Commands Reference
```bash
# Docker Services
docker-compose up -d                                     # Start all services
docker-compose down                                      # Stop all services
docker-compose logs -f                                   # View logs
docker-compose restart celery_worker                     # Restart a specific service

# FastAPI Development
uv run fastapi dev main.py                               # Start dev server with hot reload
uv run uvicorn main:app --reload                         # Alternative start method

# Database Migrations
uv run alembic upgrade head                              # Apply migrations
uv run alembic revision --autogenerate -m "Description"  # Create migration
uv run alembic history                                   # View migration history
uv run alembic downgrade -1                              # Roll back last migration

# Celery Commands
uv run celery -A celery_setup worker --loglevel=info     # Start worker
uv run celery -A celery_setup beat --loglevel=info       # Start beat scheduler
uv run celery -A celery_setup flower                     # Start monitoring

# Testing & Quality
uv run pytest                                            # Run tests
uv run pytest --cov=app                                  # Run with coverage
uv run black .                                           # Format code
uv run ruff check .                                      # Lint code
uv run mypy app/                                         # Type checking

# Package Management
uv add package-name                                      # Add package
uv sync --upgrade                                        # Update all packages
uv pip list                                              # List installed packages
```

Troubleshooting
Database Connection Issues
Error: could not connect to server
Solution:
- Ensure Docker is running: `docker ps`
- Check PostgreSQL is running: `docker-compose ps`
- Verify `DATABASE_URL` uses `postgresql+asyncpg://`
- Check the host is `postgres` (the Docker service name), not `localhost`
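The two most common mistakes (wrong driver, wrong host) can be caught with a quick standard-library check of the DSN. This helper is illustrative only, not part of the codebase:

```python
# Sanity-check a SQLAlchemy DATABASE_URL for the async Docker setup.
from urllib.parse import urlsplit


def check_database_url(dsn: str) -> list[str]:
    """Return a list of likely misconfigurations (empty list = looks fine)."""
    problems = []
    url = urlsplit(dsn)
    if url.scheme != "postgresql+asyncpg":
        problems.append("driver should be postgresql+asyncpg for async SQLAlchemy")
    if url.hostname == "localhost":
        problems.append("host should be 'postgres' (Docker service name), not localhost")
    return problems


print(check_database_url("postgresql://postgres:postgres@localhost:5432/fastlaunchapi"))
```

Running it against the sample DSN from `.env.sample` should return an empty list.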
Redis Connection Failed
Error: Error connecting to Redis
Solution:
- Check the Redis container: `docker-compose ps redis`
- Verify the `REDIS_DSN` format: `redis://:password@redis:6379/0`
- Ensure the Redis host is `redis` (the Docker service name)
- Check the `.env` file is loaded by the Celery workers
Port Already in Use
Error: Address already in use
Solution:
```bash
# Change FastAPI port
uv run fastapi dev main.py --port 8001

# Or with uvicorn
uv run uvicorn main:app --reload --port 8001
```

Celery Tasks Not Executing
Solution:
- Check the worker is running: `docker-compose ps celery_worker`
- View worker logs: `docker-compose logs celery_worker`
- Verify tasks are imported in `celery_setup.py`
- Restart the worker: `docker-compose restart celery_worker`
Migration Conflicts
Error: Multiple heads in database
Solution:
```bash
# Check migration history
uv run alembic history

# Merge heads if needed
uv run alembic merge heads

# Then upgrade
uv run alembic upgrade head
```

uv Not Found
Error: uv: command not found
Solution:
- Install uv: See Prerequisites section
- Ensure uv is on your PATH: `uv --version`
- Restart your terminal after installation
- On Windows, you may need to restart PowerShell as administrator
Import Errors After Adding Packages
Solution:
```bash
# Sync dependencies
uv sync

# If still failing, regenerate the lock file
uv lock --upgrade
uv sync
```

Production Checklist
Before deploying to production:
- Change `SECRET_KEY` to a strong random string (use `uv run python -c "import secrets; print(secrets.token_urlsafe(32))"`)
- Update database credentials
- Set the `REDIS_DSN` password
- Configure proper `BACKEND_URL` and `FRONTEND_URL`
- Add all required API keys (Stripe, SendGrid, OpenAI)
- Set up proper OAuth redirect URIs in provider consoles
- Review and update `docker-compose.yaml` for production
- Enable HTTPS (SSL/TLS)
- Set up proper logging and monitoring
- Configure database backups
- Scale Celery workers based on load
- Lock dependency versions with `uv.lock` (committed to the repo)
- Set up a CI/CD pipeline
- Configure environment variables securely (not in a `.env` file)
Security: Never commit .env files to version control. Use environment
variables or secret management systems in production.
uv in Production: The `uv.lock` file ensures the exact same dependency versions across all environments. Commit it to your repository!