An advanced AI-powered assistant for Odoo development that uses LangChain for modular LLM support, enabling easy switching between different language models.
- Multi-LLM Support: Switch between Gemini, OpenAI, and Anthropic models
- Structured Prompts: LangChain PromptTemplates for consistent, maintainable prompts
- Web Interface: Clean, responsive web UI with real-time updates
- CLI Interface: Command-line interface for direct terminal usage
- Module Creation: Generate complete Odoo modules with best practices
- Module Customization: Modify existing modules intelligently
- Code Analysis: Comprehensive code quality and security analysis
- Debug Assistant: AI-powered debugging with suggested fixes
- Interactive Approval: Review all changes before applying
- File Backup: Automatic backup of modified files
- Odoo-Specific: Tailored for Odoo 15/16 development patterns
- Python 3.8+
- UV package manager
- Odoo development environment
- API key for your preferred LLM provider
1. Clone and set up the project:

   ```bash
   cd /path/to/your/workspace
   git clone <your-repo> llama_agent
   cd llama_agent
   ```

2. Install dependencies with UV:

   ```bash
   uv sync
   ```

3. Configure the environment. Copy `.env.example` to `.env`:

   ```bash
   cp .env.example .env
   ```

4. Edit the `.env` file:

   ```env
   # LLM Configuration
   LLM_PROVIDER=gemini  # Options: gemini, openai, anthropic
   LLM_TEMPERATURE=0.1
   LLM_MAX_TOKENS=16384

   # Gemini API Configuration
   GEMINI_API_KEY=your_gemini_api_key_here

   # OpenAI API Configuration (optional)
   # OPENAI_API_KEY=your_openai_api_key_here
   # OPENAI_MODEL=gpt-4

   # Anthropic API Configuration (optional)
   # ANTHROPIC_API_KEY=your_anthropic_api_key_here
   # ANTHROPIC_MODEL=claude-3-5-sonnet-20241022

   # Odoo Development Settings
   ODOO_ADDONS_PATH=/path/to/your/odoo/addons
   ODOO_VERSION=16.0

   # Agent Configuration
   AGENT_MODE=development
   AUTO_APPROVE_CHANGES=false
   BACKUP_FILES=true
   ```
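For illustration, the settings above can be read with plain `os.getenv` calls. This is a minimal sketch, not the project's actual loader: the helper name `load_agent_config` and the defaults are hypothetical, and the real project may load `.env` differently (for example via python-dotenv before the variables reach the environment).

```python
import os

# Hypothetical helper mirroring the .env keys above. Each value falls back
# to the default documented in this README when the variable is unset.
def load_agent_config() -> dict:
    return {
        "provider": os.getenv("LLM_PROVIDER", "gemini"),
        "temperature": float(os.getenv("LLM_TEMPERATURE", "0.1")),
        "max_tokens": int(os.getenv("LLM_MAX_TOKENS", "16384")),
        "odoo_version": os.getenv("ODOO_VERSION", "16.0"),
        "auto_approve": os.getenv("AUTO_APPROVE_CHANGES", "false").lower() == "true",
        "backup_files": os.getenv("BACKUP_FILES", "true").lower() == "true",
    }

config = load_agent_config()
```

Converting numeric and boolean settings at the boundary keeps the rest of the agent free of string-parsing logic.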
Start the web server:

```bash
uv run python web_ui.py
```

Then open http://localhost:8080 in your browser.
Web UI Features:
- Create Module Tab: Generate new Odoo modules
- Customize Module Tab: Modify existing modules
- Analyze Code Tab: Get detailed code analysis
- Debug Issue Tab: AI-powered debugging assistance
Run the command-line interface:

```bash
uv run python langchain_odoo_agent.py
```

CLI Options:
- Analyze existing code
- Create new module
- Customize existing module
- Debug Odoo issue
- Exit
Gemini:

```env
LLM_PROVIDER=gemini
GEMINI_API_KEY=your_gemini_api_key
```

OpenAI:

```env
LLM_PROVIDER=openai
OPENAI_API_KEY=your_openai_api_key
OPENAI_MODEL=gpt-4  # or gpt-3.5-turbo
```

Anthropic:

```env
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=your_anthropic_api_key
ANTHROPIC_MODEL=claude-3-5-sonnet-20241022
```

Key files:
- `llm_config.py` - LLM provider management and configuration
- `prompts.py` - Structured prompt templates using LangChain
- `langchain_odoo_agent.py` - Main agent logic with LangChain integration
- `web_ui.py` - Flask-based web interface
- `templates/index.html` - Responsive web UI with Alpine.js
```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_google_genai import ChatGoogleGenerativeAI

from llm_config import LLMConfig
from prompts import get_prompt_template

# Initialize LLM
llm_config = LLMConfig(provider="gemini")
llm = llm_config.get_llm()

# Get prompt template
prompt_template = get_prompt_template("module_creation")

# Create chain
str_parser = StrOutputParser()
chain = prompt_template | llm | str_parser

# Execute
response = chain.invoke({
    "odoo_version": "16.0",
    "module_name": "my_module",
    "description": "My custom module",
    "features": "Custom functionality",
})
```

All prompts are now managed as LangChain `PromptTemplate` objects:
- `CODE_ANALYSIS_TEMPLATE` - For code quality analysis
- `MODULE_CREATION_TEMPLATE` - For generating new modules
- `MODULE_CUSTOMIZATION_TEMPLATE` - For modifying existing modules
- `DEBUG_ISSUE_TEMPLATE` - For debugging assistance
| Variable | Description | Default | Options |
|---|---|---|---|
| `LLM_PROVIDER` | LLM provider to use | `gemini` | `gemini`, `openai`, `anthropic` |
| `LLM_TEMPERATURE` | Model temperature | `0.1` | 0.0 - 2.0 |
| `LLM_MAX_TOKENS` | Maximum output tokens | `16384` | Model-dependent |
| `ODOO_ADDONS_PATH` | Path to Odoo addons | `/opt/odoo/addons` | Valid directory path |
| `ODOO_VERSION` | Target Odoo version | `16.0` | `15.0`, `16.0`, `17.0` |
| `AUTO_APPROVE_CHANGES` | Skip change approval | `false` | `true`, `false` |
| `BACKUP_FILES` | Create file backups | `true` | `true`, `false` |
| Variable | Description | Default |
|---|---|---|
| `UI_HOST` | Web server host | `localhost` |
| `UI_PORT` | Web server port | `8080` |
| `UI_DEBUG` | Flask debug mode | `true` |
```
llama_agent/
├── llm_config.py             # LLM provider configuration
├── prompts.py                # LangChain prompt templates
├── langchain_odoo_agent.py   # Main agent with LangChain
├── web_ui.py                 # Flask web interface
├── templates/
│   └── index.html            # Web UI template
├── pyproject.toml            # UV project configuration
├── .env                      # Environment configuration
├── .env.example              # Environment template
└── README.md                 # This file
```
1. Update the `LLMProvider` enum in `llm_config.py`:

   ```python
   class LLMProvider(Enum):
       GEMINI = "gemini"
       OPENAI = "openai"
       ANTHROPIC = "anthropic"
       NEW_PROVIDER = "new_provider"  # Add here
   ```

2. Add a provider method in the `LLMConfig` class:

   ```python
   def _get_new_provider_llm(self) -> BaseLLM:
       api_key = os.getenv("NEW_PROVIDER_API_KEY")
       if not api_key:
           raise ValueError("NEW_PROVIDER_API_KEY not found")
       return NewProviderLLM(
           api_key=api_key,
           temperature=self.temperature,
           max_tokens=self.max_tokens,
       )
   ```

3. Update the `get_llm()` method to handle the new provider.
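The dispatch in that last step can be sketched as a provider-to-factory mapping, so adding a provider is one enum value plus one dictionary entry. This is a hypothetical simplification: the real `get_llm()` returns LangChain chat-model instances, while the stubs here return plain strings so the shape of the logic stands alone.

```python
from enum import Enum

class LLMProvider(Enum):
    GEMINI = "gemini"
    OPENAI = "openai"
    ANTHROPIC = "anthropic"
    NEW_PROVIDER = "new_provider"

class LLMConfig:
    def __init__(self, provider: str):
        # Enum(...) raises ValueError for unknown provider strings.
        self.provider = LLMProvider(provider)

    def get_llm(self):
        # One factory per provider; extend this dict for new providers.
        factories = {
            LLMProvider.GEMINI: self._get_gemini_llm,
            LLMProvider.OPENAI: self._get_openai_llm,
            LLMProvider.ANTHROPIC: self._get_anthropic_llm,
            LLMProvider.NEW_PROVIDER: self._get_new_provider_llm,
        }
        return factories[self.provider]()

    # String stubs standing in for the real LangChain constructors
    def _get_gemini_llm(self): return "gemini-llm"
    def _get_openai_llm(self): return "openai-llm"
    def _get_anthropic_llm(self): return "anthropic-llm"
    def _get_new_provider_llm(self): return "new-provider-llm"
```

A table-driven dispatch like this keeps `get_llm()` closed for modification: new providers never touch the existing branches.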
1. Define the template in `prompts.py`:

   ```python
   NEW_TEMPLATE = PromptTemplate(
       input_variables=["param1", "param2"],
       template="""Your prompt template here with {param1} and {param2}""",
   )
   ```

2. Register it in the `get_prompt_template()` function.

3. Use it in agent methods:

   ```python
   prompt_template = get_prompt_template("new_template")
   chain = prompt_template | self.llm | self.str_parser
   response = chain.invoke({"param1": value1, "param2": value2})
   ```
```bash
# CLI
uv run python langchain_odoo_agent.py

# Web UI
uv run python web_ui.py
# Then visit http://localhost:8080
```

1. Test with Gemini:

   ```bash
   export LLM_PROVIDER=gemini
   uv run python langchain_odoo_agent.py
   ```

2. Test with OpenAI:

   ```bash
   export LLM_PROVIDER=openai
   export OPENAI_API_KEY=your_key
   uv run python langchain_odoo_agent.py
   ```

3. Test with Anthropic:

   ```bash
   export LLM_PROVIDER=anthropic
   export ANTHROPIC_API_KEY=your_key
   uv run python langchain_odoo_agent.py
   ```
- "LLM provider not found"
  - Check the `LLM_PROVIDER` value in `.env`
  - Ensure the provider's API key is configured

- "Failed to parse JSON response"
  - The LLM response may be truncated
  - Try increasing `LLM_MAX_TOKENS`
  - Check LLM provider rate limits

- "Module path not found"
  - Verify `ODOO_ADDONS_PATH` points to the correct directory
  - Ensure path permissions are correct

- Web UI not loading
  - Check if port 8080 is available
  - Try changing `UI_PORT` in `.env`
  - Check firewall settings
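To check whether port 8080 is already taken before changing `UI_PORT`, a small stdlib snippet works on any platform (this helper is not part of the project, just a diagnostic aid):

```python
import socket

def port_in_use(host: str, port: int) -> bool:
    # Try to connect; a successful connection (connect_ex == 0) means
    # something is already listening on that host/port.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# Example: port_in_use("127.0.0.1", 8080)
```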
Enable detailed logging:

```bash
export UI_DEBUG=true
uv run python web_ui.py
```

This project is licensed under the MIT License.
- Fork the repository
- Create a feature branch
- Add your changes with tests
- Submit a pull request
For issues and questions:
- Open an issue on GitHub
- Check the troubleshooting section
- Review the LangChain documentation for advanced usage
Happy Odoo Development!