Write Once, Deploy Everywhere
Quick Start | Documentation | Architecture | Innovative Features
Paracle is a framework for building production-ready multi-agent AI applications. Designed for scalability, security, and interoperability, Paracle enables organizations to develop sophisticated AI systems with confidence.
Implement sophisticated agent hierarchies using object-oriented principles. Inherit configurations, behaviors, and capabilities across agent families for maintainable, scalable systems. Support for 14+ LLM providers ensures vendor flexibility.
Seamless integration with leading AI frameworks: Microsoft Semantic Kernel (MSAF), LangChain, and LlamaIndex. Choose the right tool for your use case. Define agent capabilities once, deploy across platforms: GitHub Copilot, Cursor, Claude Code, OpenAI Codex, and custom IDEs.
Production-grade RESTful API built with FastAPI. Comprehensive OpenAPI documentation, authentication, and rate limiting included. Native support for the emerging MCP standard, enabling standardized tool discovery and interoperability across AI platforms. Federated agent communication protocol supporting distributed multi-agent systems and cross-organization collaboration. Bring Your Own (BYO) architecture: models, frameworks, tools, infrastructure. No vendor lock-in.
Using uv (recommended):

```shell
uv pip install paracle
```

Using pip:

```shell
pip install paracle
```
API Keys Setup

```shell
# Copy the example file and add your keys
cp .env.example .env
# Edit .env with your API keys
```

See the API Keys Guide for details.
```shell
paracle hello
```

Interactive Tutorial (30 minutes of hands-on training):

```shell
paracle tutorial start
```

Training modules:
- Agent creation and configuration
- Tool integration (filesystem, HTTP, shell)
- Skills definition and deployment
- Project template development
- Local testing and validation
- Workflow orchestration
Resume anytime: `paracle tutorial resume`
```shell
# Initialize the workspace
paracle init

# List available agents
paracle agents list

# Run an agent with a task
paracle agents run coder --task "Create a hello world script"
```

```python
from paracle_domain.models import AgentSpec, Agent

# Define an agent
agent_spec = AgentSpec(
    name="code-assistant",
    description="A helpful coding assistant",
    provider="openai",
    model="gpt-4",
    temperature=0.7,
    system_prompt="You are an expert Python developer.",
)

# Create the agent from its spec
agent = Agent(spec=agent_spec)
print(f"Agent created: {agent.id}")
```

That's it! You're ready to build AI applications with Paracle.
```
paracle-lite/
├── .parac/                    # Project workspace (config, memory, runs)
├── packages/                  # Modular packages
│   ├── paracle_core/          # Core utilities
│   ├── paracle_domain/        # Domain models
│   ├── paracle_store/         # Persistence
│   ├── paracle_events/        # Event bus
│   ├── paracle_providers/     # LLM providers
│   ├── paracle_adapters/      # Framework adapters
│   ├── paracle_orchestration/ # Workflow engine
│   ├── paracle_tools/         # Tool management
│   ├── paracle_skills/        # Skills system (multi-platform)
│   ├── paracle_mcp/           # MCP protocol client
│   ├── paracle_api/           # REST API
│   └── paracle_cli/           # CLI
├── tests/                     # Test suite
└── content/                   # Documentation and templates
    ├── docs/                  # User documentation
    ├── templates/             # Project templates
    └── examples/              # Example projects
```
Paracle follows a modular monolith architecture with clear boundaries:
- Domain Layer: Pure business logic (agents, workflows, tools)
- Infrastructure Layer: Persistence, events, providers
- Application Layer: Orchestration, API, CLI
- Adapters: External integrations (MSAF, LangChain, etc.)
See Architecture Documentation for details.
Hierarchical Agent Architecture
```python
from paracle_domain.models import AgentSpec

# Base agent
base_agent = AgentSpec(
    name="base-coder",
    provider="openai",
    model="gpt-4",
    temperature=0.7,
)

# Specialized agent: fields not set here (provider, model,
# temperature) are inherited from "base-coder"
python_expert = AgentSpec(
    name="python-expert",
    parent="base-coder",
    system_prompt="Expert in Python best practices",
    tools=["pytest", "pylint"],
)
```

Multi-Provider Support - switch providers instantly:
```python
from paracle_domain.models import AgentSpec

# OpenAI
agent1 = AgentSpec(provider="openai", model="gpt-4")

# Anthropic
agent2 = AgentSpec(provider="anthropic", model="claude-sonnet-4.5")

# Local via Ollama (free)
agent3 = AgentSpec(provider="ollama", model="llama3")
```

14+ providers supported, commercial and self-hosted.
Workflow Orchestration
```python
from paracle_domain.models import Workflow, WorkflowStep

workflow = Workflow(
    name="code-review",
    steps=[
        WorkflowStep(
            id="analyze",
            agent_id="analyzer",
            prompt="Analyze this code",
        ),
        WorkflowStep(
            id="suggest",
            agent_id="advisor",
            prompt="Suggest improvements",
            dependencies=["analyze"],  # runs only after "analyze" completes
        ),
    ],
)
```
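The `dependencies` field implies an execution order: a step runs only after everything it depends on has finished. For a dependency graph like the one above, a valid order can be computed with the standard library's `graphlib`; this is a sketch of the idea, not how `paracle_orchestration` is actually implemented.

```python
from graphlib import TopologicalSorter

# Step -> steps it depends on, mirroring the workflow above.
dependencies = {
    "analyze": [],
    "suggest": ["analyze"],
}

# static_order() yields each step only after all of its dependencies,
# so "analyze" always precedes "suggest".
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

`graphlib` also raises `CycleError` on circular dependencies, which is the same validation a workflow engine needs before scheduling steps.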
Roadmap • Architecture Decisions • Examples
Setup Development Environment
```shell
# Clone the repository
git clone https://github.com/IbIFACE-Tech/paracle-lite.git
cd paracle-lite

# Install with dev dependencies
make install-dev

# Or with uv (recommended)
uv sync --all-extras
```

Running Tests
```shell
# Run all tests
make test

# With coverage report
make test-cov

# Watch mode (auto-reload)
make test-watch
```

700+ tests: unit, integration, and end-to-end.
Code Quality
```shell
# Run all linters
make lint

# Auto-format code
make format
```

Tools: ruff, mypy, black, isort.
Paracle v1.0.1 is production-ready!
Current Phase: Phase 10 - Governance & v1.0 Release (95% complete)
We welcome contributions from the community.
1. Fork the repository
2. Create a feature branch
3. Implement your changes
4. Test and validate quality
5. Submit a pull request
Licensed under Apache License 2.0
Free and open source for personal and commercial use
Bug Reports • Feature Requests • Questions • Ideas
All welcome on GitHub Issues and Discussions
Version 1.0.1 | 700+ Tests | 95/100 Security Score | ISO/SOC2 Compliant
Built with ❤️ by the IbIFACE Team