A Python AI agent built with LangChain that supports multiple model providers including Anthropic Claude, OpenAI GPT, and Ollama models. The agent provides a unified interface for tool calling across different model providers.
- Multi-provider support: Anthropic, OpenAI compatible APIs, and Ollama
- Unified tool interface: Same tool definitions work across all model providers
- Predefined presets: Quick setup with common model configurations
- Flexible configuration: Custom model parameters and endpoints
- Interactive CLI: Real-time conversation with tool calling capabilities
- Install dependencies:

```bash
pip install -r requirements.txt
```

- Set up API keys (as needed):

```bash
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```

The easiest way to get started is with the predefined model presets:
```bash
# Anthropic Claude models
python main.py --preset claude-3.5-sonnet
python main.py --preset claude-3-haiku

# OpenAI models
python main.py --preset gpt-4-turbo
python main.py --preset gpt-3.5-turbo

# Ollama models (requires Ollama running locally)
python main.py --preset llama3.1-8b
python main.py --preset llama3.1-70b
python main.py --preset mistral-7b
python main.py --preset codellama-7b
```

For more control, use custom provider configurations:
```bash
# Anthropic with custom parameters
python main.py --provider anthropic --model claude-3-5-sonnet-20241022 --temperature 0.7

# OpenAI with a custom API key
python main.py --provider openai --model gpt-4-turbo-preview --api-key your-key

# Custom OpenAI-compatible endpoint
python main.py --provider openai --model custom-model --base-url https://api.custom.com/v1

# Ollama with a custom endpoint
python main.py --provider ollama --model llama3.1:8b --base-url http://localhost:11434

# With additional parameters
python main.py --provider anthropic --model claude-3-5-sonnet-20241022 --temperature 0.7 --max-tokens 2048

# Disable SSL verification (useful for self-signed certificates)
python main.py --preset llama3.1-8b --no-verify-ssl
python main.py --provider ollama --model llama3.1:8b --base-url https://my-ollama:11434 --no-verify-ssl
```

- `--preset`: Use a predefined model preset (mutually exclusive with `--provider`)
- `--provider`: Model provider (`anthropic`, `openai`, or `ollama`); requires `--model`
- `--model`: Model name (required when using `--provider`)
- `--api-key`: API key for the model provider
- `--base-url`: Base URL for the API (useful for custom OpenAI-compatible endpoints or Ollama)
- `--temperature`: Temperature for text generation (default: 0.0)
- `--max-tokens`: Maximum tokens in the response (default: 1024)
- `--no-verify-ssl`: Disable SSL certificate verification (useful for self-signed certificates)
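The flag semantics above (mutually exclusive `--preset` and `--provider`, with `--model` required alongside `--provider`) can be sketched with `argparse`. This is an illustrative parser, not necessarily the exact one in `main.py`:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Multi-provider AI agent")
    # --preset and --provider cannot be combined
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("--preset", help="Predefined model preset")
    group.add_argument("--provider", choices=["anthropic", "openai", "ollama"])
    parser.add_argument("--model", help="Model name (required with --provider)")
    parser.add_argument("--api-key")
    parser.add_argument("--base-url")
    parser.add_argument("--temperature", type=float, default=0.0)
    parser.add_argument("--max-tokens", type=int, default=1024)
    parser.add_argument("--no-verify-ssl", action="store_true")
    return parser

args = build_parser().parse_args(
    ["--provider", "ollama", "--model", "llama3.1:8b", "--no-verify-ssl"]
)
# argparse cannot express "--model required only with --provider", so check it manually
if args.provider and not args.model:
    raise SystemExit("--model is required when using --provider")
print(args.provider, args.model, args.temperature, args.no_verify_ssl)
```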
- `ANTHROPIC_API_KEY`: Your Anthropic API key
- `OPENAI_API_KEY`: Your OpenAI API key
- `OPENAI_BASE_URL`: Custom OpenAI-compatible endpoint (optional)
- `OLLAMA_BASE_URL`: Ollama server URL (default: `http://localhost:11434`)
- Get an API key from the Anthropic Console
- Set the environment variable:

```bash
export ANTHROPIC_API_KEY="your-key"
```

- Available models: `claude-3-5-sonnet-20241022`, `claude-3-haiku-20240307`, etc.
- Get an API key from the OpenAI Platform
- Set the environment variable:

```bash
export OPENAI_API_KEY="your-key"
```

- Available models: `gpt-4-turbo-preview`, `gpt-3.5-turbo`, etc.
- Install Ollama from ollama.com
- Start the Ollama server:

```bash
ollama serve
```

- Pull a model:

```bash
ollama pull llama3.1:8b
```

- Available models: any model supported by your Ollama installation
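To confirm the server is up and see which models have been pulled, you can query Ollama's REST API (assuming the standard `GET /api/tags` endpoint, which returns installed models as JSON). A small stdlib-only check:

```python
import json
import urllib.request
import urllib.error

def list_local_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return names of models pulled into Ollama, or [] if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return []

models = list_local_models()
print(models if models else "Ollama server not reachable (is `ollama serve` running?)")
```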
The agent comes with these built-in tools:
- `read_file`: Read the contents of a file
- `list_files`: List files and directories in a path
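Both built-in tools map to simple filesystem operations. As a rough sketch of the behavior you can expect (illustrative only; the real implementations live in the repository's `tools` module):

```python
import os

def read_file(path: str) -> str:
    """Roughly what the read_file tool does: return a file's text contents."""
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

def list_files(path: str = ".") -> list[str]:
    """Roughly what the list_files tool does: list entries, marking directories."""
    return sorted(
        name + "/" if os.path.isdir(os.path.join(path, name)) else name
        for name in os.listdir(path)
    )

print(list_files("."))
```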
To add custom tools, implement the `ToolDefinition` interface:

```python
from agent import ToolDefinition

class MyCustomTool(ToolDefinition):
    def name(self) -> str:
        return "my_tool"

    def description(self) -> str:
        return "Description of what my tool does"

    def input_schema(self) -> dict:
        return {
            "type": "object",
            "properties": {
                "param": {"type": "string", "description": "Parameter description"}
            },
            "required": ["param"],
        }

    def execute(self, **kwargs) -> str:
        # Tool implementation
        return "Tool result"
```

Then add it to your agent:
```python
from agent import MultiModelAgent, ModelFactory, ModelPresets, ModelProvider
from tools import ReadFileTool, ListFilesTool

# Create the model
model = ModelFactory.create_model(ModelPresets.CLAUDE_3_5_SONNET)

# Add tools, including your custom tool
tools = [ReadFileTool(), ListFilesTool(), MyCustomTool()]

# Create the agent
agent = MultiModelAgent(model, ModelProvider.ANTHROPIC, get_user_message, tools)
```

Example conversation:

```
You: List the files in the current directory
Agent: [uses list_files tool]

You: Read the contents of main.py
Agent: [uses read_file tool]

You: Analyze the structure of this Python project
Agent: [uses list_files to explore, then read_file to examine key files]
```
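Since a tool's `input_schema` is plain JSON Schema, the arguments the model supplies can be checked before `execute` runs. The following is a minimal hand-rolled validator for illustration only; the agent itself may validate differently or defer to the provider:

```python
def validate_input(schema: dict, kwargs: dict) -> list[str]:
    """Return a list of problems; an empty list means the arguments satisfy the schema."""
    errors = []
    props = schema.get("properties", {})
    # Check required parameters are present
    for name in schema.get("required", []):
        if name not in kwargs:
            errors.append(f"missing required parameter: {name}")
    # Check basic JSON Schema types against Python types
    type_map = {"string": str, "integer": int, "number": (int, float),
                "boolean": bool, "object": dict, "array": list}
    for name, value in kwargs.items():
        expected = props.get(name, {}).get("type")
        if expected in type_map and not isinstance(value, type_map[expected]):
            errors.append(f"{name}: expected {expected}, got {type(value).__name__}")
    return errors

schema = {
    "type": "object",
    "properties": {"param": {"type": "string", "description": "Parameter description"}},
    "required": ["param"],
}
print(validate_input(schema, {"param": "ok"}))  # → []
print(validate_input(schema, {}))               # → ['missing required parameter: param']
```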
- `ModelFactory`: Creates model instances for different providers
- `MultiModelAgent`: Main agent class handling the conversation loop and tool execution
- `ToolDefinition`: Interface for implementing custom tools
- Provider-specific handling: Different tool-calling formats for each provider
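That last point is the main thing the unified tool interface abstracts away: each provider expects tool definitions in a slightly different wire shape. A rough sketch of the two conversions (field layouts per the providers' public APIs at the time of writing; the agent's internal helpers may be named differently):

```python
def to_anthropic(name: str, description: str, schema: dict) -> dict:
    # Anthropic's Messages API takes the JSON Schema under "input_schema"
    return {"name": name, "description": description, "input_schema": schema}

def to_openai(name: str, description: str, schema: dict) -> dict:
    # OpenAI's chat completions API wraps it in a "function" object
    # with the JSON Schema under "parameters"
    return {
        "type": "function",
        "function": {"name": name, "description": description, "parameters": schema},
    }

schema = {"type": "object", "properties": {"path": {"type": "string"}}, "required": ["path"]}
print(to_anthropic("read_file", "Read a file", schema))
print(to_openai("read_file", "Read a file", schema))
```

One `ToolDefinition` thus serves every provider; only this final serialization step differs.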
- Fork the repository
- Create a feature branch
- Add your changes with tests
- Submit a pull request
MIT License - see LICENSE file for details.