§Acton-AI: Agentic AI Framework
An agentic AI framework where each agent is an actor, leveraging acton-reactive’s supervision, pub/sub, and fault tolerance to create resilient, concurrent AI systems.
§Architecture
- Kernel: Central supervisor managing all agents
- Agent: Individual AI agents with reasoning loops
- LLM Provider: Manages streaming LLM API calls with rate limiting
- Tool Registry: Registers and executes tools via supervised child actors
- Memory Store: Persistence via Turso/libSQL
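The kernel-supervises-agents shape above can be sketched with plain std threads and channels — this is not acton-reactive's actual API, just an illustration of the message-passing pattern; `AgentMsg` and `ask` are hypothetical names invented for this sketch:

```rust
use std::sync::mpsc;
use std::thread;

// Messages a supervisor routes to an agent (hypothetical, for illustration).
enum AgentMsg {
    Prompt(String, mpsc::Sender<String>),
    Shutdown,
}

/// Spawn one "agent" worker, send it a prompt, and collect the reply.
fn ask(text: &str) -> String {
    let (tx, rx) = mpsc::channel::<AgentMsg>();

    // The "agent": a worker with a simple receive loop.
    let agent = thread::spawn(move || {
        for msg in rx {
            match msg {
                AgentMsg::Prompt(p, reply) => {
                    // A real agent would run its reasoning loop and
                    // call the LLM provider here; we just echo.
                    let _ = reply.send(format!("echo: {p}"));
                }
                AgentMsg::Shutdown => break,
            }
        }
    });

    let (reply_tx, reply_rx) = mpsc::channel();
    tx.send(AgentMsg::Prompt(text.to_string(), reply_tx)).unwrap();
    let answer = reply_rx.recv().unwrap();

    tx.send(AgentMsg::Shutdown).unwrap();
    agent.join().unwrap();
    answer
}

fn main() {
    println!("{}", ask("Hello, agent!")); // prints "echo: Hello, agent!"
}
```

In acton-ai the same roles are filled by supervised actors with fault tolerance and pub/sub, but the core idea is identical: the kernel owns channels to its agents and all interaction is message passing.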
§Quick Start (High-Level API)
The simplest way to use acton-ai is via the ActonAI facade:
use acton_ai::prelude::*;

#[tokio::main]
async fn main() -> Result<(), ActonAIError> {
    let runtime = ActonAI::builder()
        .app_name("my-app")
        .ollama("qwen2.5:7b")
        .launch()
        .await?;

    runtime
        .prompt("What is the capital of France?")
        .system("Be concise.")
        .on_token(|t| print!("{t}"))
        .collect()
        .await?;
    println!();
    Ok(())
}

§Advanced Usage (Low-Level API)
For full control over the actor system:
use acton_ai::prelude::*;

#[tokio::main]
async fn main() {
    let mut app = ActonApp::launch_async().await;
    let kernel = Kernel::spawn(&mut app).await;
    let agent_id = kernel.spawn_agent(AgentConfig::default()).await;
    kernel.send_prompt(agent_id, "Hello, agent!").await;
    app.shutdown_all().await.unwrap();
}

Modules§
- agent
- Agent actor module.
- config
- Configuration management for acton-ai.
- conversation
- Managed conversation abstraction for multi-turn interactions.
- error
- Custom error types for the Acton-AI framework.
- facade
- High-level facade for ActonAI.
- kernel
- Kernel actor module.
- llm
- LLM provider module.
- memory
- Memory and persistence module for Acton-AI.
- messages
- Message types for inter-actor communication.
- prelude
- Prelude module for convenient imports.
- prompt
- Fluent prompt builder for LLM requests.
- stream
- Stream handling for LLM responses.
- tools
- Tool system for the Acton-AI framework.
- types
- Core type definitions for the Acton-AI framework.