Turbine

High-performance distributed task queue written in Rust

A modern, reliable alternative to Celery with first-class Python support


Why Turbine?

Turbine was built to solve the common pain points of Celery while maintaining a familiar API for Python developers:

| Celery Pain Point | Turbine Solution |
|---|---|
| High memory usage (Python workers) | Rust workers use ~10x less memory |
| GIL limits concurrency | True parallelism with no GIL |
| Task loss on worker crash | Visibility timeout + automatic redelivery |
| Complex configuration | Sensible defaults, single config file |
| Poor monitoring | Built-in Prometheus metrics + dashboard |
| Slow cold start | Instant startup, no Python import overhead |
| Result backend reliability | Optional results with TTL, S3 offload |
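The visibility-timeout mechanism mentioned above can be illustrated with a small in-memory sketch (plain Python, not Turbine's actual Redis-backed implementation): a leased task that is not acknowledged before its deadline becomes visible again and is redelivered to the next worker.

```python
import time

class VisibilityQueue:
    """Minimal in-memory model of visibility-timeout redelivery."""

    def __init__(self, visibility_timeout: float):
        self.visibility_timeout = visibility_timeout
        self.ready = []        # tasks visible to workers
        self.in_flight = {}    # task -> lease expiry time

    def push(self, task):
        self.ready.append(task)

    def lease(self, now=None):
        """Hand a task to a worker and hide it until the lease expires."""
        now = time.monotonic() if now is None else now
        self._requeue_expired(now)
        if not self.ready:
            return None
        task = self.ready.pop(0)
        self.in_flight[task] = now + self.visibility_timeout
        return task

    def ack(self, task):
        """Worker finished successfully: remove the task permanently."""
        self.in_flight.pop(task, None)

    def _requeue_expired(self, now):
        expired = [t for t, deadline in self.in_flight.items() if deadline <= now]
        for t in expired:
            # lease expired: worker presumed crashed, make task visible again
            del self.in_flight[t]
            self.ready.append(t)
```

If a worker crashes after `lease()` but before `ack()`, the task reappears on a later `lease()` call, which is why a worker crash does not lose the task.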

Features

  • High Performance: Zero-copy message handling, async I/O, minimal allocations
  • Reliable: Exactly-once semantics (where possible), persistent task state, graceful shutdown
  • Observable: Built-in Prometheus metrics, OpenTelemetry tracing, web dashboard
  • Flexible Brokers: Redis supported today; RabbitMQ and AWS SQS planned
  • Workflows: Chains, groups, and chords for complex task composition
  • Python SDK: Easy integration with Django, FastAPI, and other frameworks

Quick Start

Using Docker Compose

# Clone the repository
git clone https://github.com/turbine-queue/turbine.git
cd turbine

# Start Turbine with Redis
docker-compose -f docker/docker-compose.yml up -d

# Check health
curl http://localhost:50051/health

Building from Source

# Prerequisites: Rust 1.75+, Redis

# Build all crates
cargo build --release

# Run server
./target/release/turbine-server

# Run worker (in another terminal)
./target/release/turbine-worker

Configuration

Turbine can be configured via TOML file, environment variables, or CLI arguments:

# Using environment variables
export TURBINE_BROKER_URL=redis://localhost:6379
export TURBINE_CONCURRENCY=4
./target/release/turbine-worker

# Using config file
./target/release/turbine-server --config turbine.toml

See turbine.toml for all configuration options.
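As a rough sketch of what such a file might look like, here is an illustrative fragment. The key names are assumptions mirroring the `TURBINE_*` environment variables above, not a verified schema; the shipped turbine.toml is the authoritative reference.

```toml
# Illustrative only — see the repository's turbine.toml for real options.
broker_url = "redis://localhost:6379"

[worker]
concurrency = 4
```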

Architecture

┌─────────────────────────────────────────────────────────────────┐
│                    Python/Django/FastAPI App                     │
│                     (turbine-py SDK via gRPC)                    │
└─────────────────────────────────────────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│                      Turbine Server (Rust)                       │
│  ┌─────────────┐  ┌─────────────┐  ┌─────────────────────────┐  │
│  │  gRPC API   │  │  REST API   │  │   Web Dashboard         │  │
│  └─────────────┘  └─────────────┘  └─────────────────────────┘  │
│                                                                  │
│  ┌──────────────────────────────────────────────────────────┐   │
│  │  Task Router  •  Workflow Engine  •  Scheduler (Beat)     │   │
│  └──────────────────────────────────────────────────────────┘   │
└─────────────────────────────────────────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│                        Message Broker                            │
│     Redis (Ready)  │  RabbitMQ (Planned)  │  AWS SQS (Planned)  │
└─────────────────────────────────────────────────────────────────┘
                                │
        ┌───────────────────────┴───────────────────────┐
        ▼                                               ▼
┌─────────────────────────────┐     ┌─────────────────────────────┐
│   Turbine Workers (Rust)    │     │   Python Workers            │
│  • High-performance tasks   │     │  • Django/FastAPI tasks     │
│  • Subprocess isolation     │     │  • Native Python execution  │
│  • Memory efficient         │     │  • Auto-discovery           │
└─────────────────────────────┘     └─────────────────────────────┘
                                │
                                ▼
┌─────────────────────────────────────────────────────────────────┐
│                        Result Backend                            │
│              Redis  │  PostgreSQL (Planned)  │  S3               │
└─────────────────────────────────────────────────────────────────┘

Crates

| Crate | Description | Status |
|---|---|---|
| turbine-core | Core types, traits, configuration | ✅ Ready |
| turbine-broker | Message broker abstraction | ✅ Redis |
| turbine-backend | Result backend abstraction | ✅ Redis |
| turbine-server | gRPC/REST server | ✅ Ready |
| turbine-worker | Task execution engine | ✅ Ready |
| turbine-py | Python SDK with Django/FastAPI support | ✅ Ready |

Python SDK

Install the Python SDK:

pip install turbine-queue

# With worker dependencies (for running Python workers)
pip install turbine-queue[worker]

# With Django integration
pip install turbine-queue[django]

Basic Usage

from turbine import Turbine, task

# Initialize client
turbine = Turbine(server="localhost:50051")

# Define a task
@task(queue="emails", max_retries=3, timeout=300)
def send_email(to: str, subject: str, body: str):
    # Task logic here
    pass

# Submit task
task_id = send_email.delay(to="[email protected]", subject="Hello", body="World")

# Check status
status = turbine.get_task_status(task_id)
print(f"Task state: {status['state']}")

Django Integration

# settings.py
INSTALLED_APPS = [
    ...
    'turbine.django',
]

TURBINE_SERVER = "localhost:50051"
TURBINE_BROKER_URL = "redis://localhost:6379"

# tasks.py
from turbine import task

@task(queue="emails")
def send_welcome_email(user_id: int):
    user = User.objects.get(id=user_id)
    # Send email...

# views.py
from .tasks import send_welcome_email

def signup(request):
    user = create_user(request.POST)
    send_welcome_email.delay(user_id=user.id)
    return redirect("home")

Run the Python worker:

# Using CLI
turbine worker --broker-url redis://localhost:6379 -I myapp.tasks

# Using Django management command
python manage.py turbine_worker -Q emails,default

Workflows

from turbine import chain, group, chord

# Chain: tasks run sequentially, passing results
workflow = chain(
    fetch_data.s(url),
    process_data.s(),
    store_results.s()
)
workflow.delay()

# Group: tasks run in parallel
group(
    send_email.s(to="[email protected]", subject="Hi", body="..."),
    send_email.s(to="[email protected]", subject="Hi", body="..."),
).delay()

# Chord: group + callback after all complete
chord(
    [process_chunk.s(chunk) for chunk in chunks],
    aggregate_results.s()
).delay()
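The composition semantics above can be sketched with plain Python functions — a synchronous model of what the workflow engine does with results, not the real distributed implementation:

```python
def chain(*tasks):
    """Run tasks sequentially, feeding each result into the next."""
    def run(initial):
        result = initial
        for t in tasks:
            result = t(result)
        return result
    return run

def group(*tasks):
    """Run tasks independently; collect all results."""
    def run(args_list):
        return [t(a) for t, a in zip(tasks, args_list)]
    return run

def chord(tasks, callback):
    """Run a group, then pass the collected results to a callback."""
    def run(args_list):
        return callback([t(a) for t, a in zip(tasks, args_list)])
    return run
```

For example, `chain(double, inc)(3)` evaluates `inc(double(3))`, and a chord over two tasks hands both results to the callback in one list.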

Dead Letter Queue (DLQ)

Failed tasks that exceed retry limits are automatically sent to the DLQ for inspection:

# List failed tasks
turbine dlq list

# Show DLQ statistics
turbine dlq stats

# Inspect a specific failed task
turbine dlq inspect <task-id>

# Remove a task from DLQ
turbine dlq remove <task-id>

# Clear all tasks from DLQ
turbine dlq clear --force
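The retry-then-DLQ flow can be modeled in a few lines. This is an in-memory sketch of the policy (retry up to the limit, then park the failure for inspection); Turbine's actual DLQ lives in the broker:

```python
def execute_with_retries(task, args, max_retries, dlq):
    """Run a task; after max_retries failed retries, park it in the DLQ."""
    last_error = None
    for attempt in range(max_retries + 1):
        try:
            return task(*args)
        except Exception as exc:
            last_error = exc
    # retries exhausted: record the failure for later inspection
    dlq.append({"task": task.__name__, "args": args, "error": str(last_error)})
    return None
```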

Web Dashboard

Turbine includes a comprehensive REST API for real-time monitoring and management:

Starting the Dashboard

# Build the dashboard
cargo build --release -p turbine-dashboard

# Run with default settings
./target/release/turbine-dashboard

# Custom configuration
./target/release/turbine-dashboard \
  --host 0.0.0.0 \
  --port 8080 \
  --redis-url redis://localhost:6379

API Endpoints

The dashboard provides the following REST endpoints:

Health & Overview:

  • GET /api/health - Health check
  • GET /api/overview - Dashboard overview statistics

Workers:

  • GET /api/workers - List all workers
  • GET /api/workers/:id - Get worker details

Queues:

  • GET /api/queues - List all queues
  • GET /api/queues/:name - Get queue details
  • GET /api/queues/:name/stats - Queue statistics
  • POST /api/queues/:name/purge - Purge queue

Tasks:

  • GET /api/tasks - List recent tasks
  • GET /api/tasks/:id - Get task details
  • POST /api/tasks/:id/revoke - Revoke a task

Dead Letter Queue:

  • GET /api/dlq/:queue - Get DLQ info
  • POST /api/dlq/:queue/reprocess - Reprocess DLQ messages
  • POST /api/dlq/:queue/purge - Purge DLQ

Metrics & Events:

  • GET /api/metrics - Prometheus metrics
  • GET /api/events - Server-Sent Events for real-time updates

Example API Usage

# Get overview
curl http://localhost:8080/api/overview

# List queues
curl http://localhost:8080/api/queues

# Get task status
curl http://localhost:8080/api/tasks/task-id-here

# Listen to real-time events
curl -N http://localhost:8080/api/events
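The events endpoint uses standard Server-Sent Events framing (`data:` lines terminated by a blank line). A minimal parser for that framing looks like the sketch below; the JSON payload shape shown in the test is hypothetical, not Turbine's documented event schema:

```python
import json

def parse_sse(stream_lines):
    """Yield parsed JSON payloads from an SSE text stream.

    SSE frames are blocks of 'field: value' lines separated by a
    blank line; only 'data:' fields are collected here.
    """
    data_parts = []
    for line in stream_lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            data_parts.append(line[5:].lstrip())
        elif line == "" and data_parts:
            # blank line terminates the event
            yield json.loads("\n".join(data_parts))
            data_parts = []
```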

Frontend UI: Coming soon! The backend API is complete and ready for a React/Vue/Svelte frontend.

Benchmarks

Coming soon! We're working on comprehensive benchmarks comparing:

  • Memory usage vs Celery
  • Throughput (tasks/second)
  • Latency (p50, p95, p99)
  • Cold start time

Roadmap

Phase 1: Core Foundation ✅

  • Task types and serialization
  • Redis broker implementation
  • Redis result backend
  • Basic worker with task execution
  • gRPC server structure

Phase 2: Python SDK ✅

  • Python gRPC client
  • @task decorator
  • Django integration
  • FastAPI integration
  • Python worker for executing tasks

Phase 3: Reliability & Workflows ✅

  • Retry with exponential backoff
  • Chain, Group, Chord execution
  • Beat scheduler (cron)
  • Dead letter queues (DLQ)
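Exponential backoff typically computes the retry delay as base * 2^attempt, capped and jittered; a sketch of that standard formula (the exact parameters Turbine uses are not specified here):

```python
import random

def backoff_delay(attempt, base=1.0, cap=60.0, jitter=True):
    """Delay before retry `attempt` (0-based): base * 2^attempt, capped.

    Full jitter picks a uniform value in [0, delay] so many failing
    tasks do not all retry at the same instant.
    """
    delay = min(cap, base * (2 ** attempt))
    return random.uniform(0, delay) if jitter else delay
```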

Phase 4: Observability

  • Prometheus metrics
  • OpenTelemetry tracing
  • Web dashboard backend (REST API + SSE)
  • Web dashboard frontend (UI)

Phase 5: Advanced Features ✅

  • Rate limiting
  • Priority queues
  • TLS/mTLS encryption
  • Multi-tenancy
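Rate limiting of this kind is commonly implemented as a token bucket; a minimal sketch for illustration (not Turbine's internal implementation):

```python
class TokenBucket:
    """Allow up to `rate` operations per second, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now: float) -> bool:
        # refill proportionally to elapsed time, up to capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```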

Phase 6: Additional Brokers (Planned)

  • RabbitMQ support
  • AWS SQS support
  • Kafka support

Phase 7: Next Steps (Planned)

  • Dashboard web UI (React/Vue/Svelte)
  • Multi-tenancy with resource quotas
  • Result backend: PostgreSQL
  • Result backend: S3 for large payloads
  • Task result compression
  • Grafana dashboard templates
  • Load balancing strategies
  • Task dependencies and DAGs
  • Batch processing optimizations

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Good First Issues

Looking to contribute? Check out issues labeled good first issue.

Areas We Need Help

| Area | Description | Skills |
|---|---|---|
| 🎨 Dashboard Frontend | Build a React/Vue/Svelte UI consuming the REST API | TypeScript, React/Vue, SSE |
| 🐰 RabbitMQ Broker | Implement AMQP 0.9.1 broker support | Rust, RabbitMQ |
| ☁️ AWS SQS Broker | Implement SQS broker for cloud deployments | Rust, AWS SDK |
| 🧪 Benchmarks | Performance comparison with Celery (memory, throughput, latency) | Python, Rust, Testing |
| 🏢 Multi-tenancy | Add tenant isolation and resource quotas | Rust, Distributed Systems |
| 📚 Documentation | Migration guides, best practices, tutorials | Technical Writing |
| 🔧 Examples | More example apps (Flask, Sanic, CLI tools) | Python |

Tech Stack

  • Backend: Rust (Tokio, Tonic gRPC, Serde)
  • Broker: Redis (RabbitMQ/SQS planned)
  • Python SDK: Python 3.9+, gRPC, MessagePack
  • Frameworks: Django, FastAPI

License

Licensed under either of:

  • Apache License, Version 2.0 (LICENSE-APACHE)
  • MIT License (LICENSE-MIT)

at your option.

Acknowledgments

Turbine is inspired by:

  • Celery - The original Python task queue
  • Sidekiq - Ruby's excellent background job processor
  • Tokio - Rust's async runtime

Built with ❤️ in Rust
