The configuration system in Codegate is managed through the `Config` class in `config.py`. It supports multiple configuration sources with a clear priority order (highest to lowest):
- CLI arguments
- Environment variables
- Config file (YAML)
- Default values (including default prompts from prompts/default.yaml)
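The priority order above amounts to a last-write-wins merge from lowest to highest priority. A minimal sketch of that resolution logic (the function name and dict-based representation are illustrative, not the actual `Config` implementation):

```python
def resolve_config(cli_args: dict, env_vars: dict, file_config: dict, defaults: dict) -> dict:
    """Merge configuration sources; later sources override earlier ones.

    Applied in order: defaults -> config file -> environment -> CLI,
    so CLI arguments win over everything else.
    """
    merged = dict(defaults)
    for source in (file_config, env_vars, cli_args):
        for key, value in source.items():
            if value is not None:  # unset options do not override lower-priority sources
                merged[key] = value
    return merged
```

For example, a port given on the CLI overrides one from the config file, which in turn overrides the built-in default.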
The default values are:

- Port: `8989`
- Host: `"localhost"`
- Log Level: `"INFO"`
- Log Format: `"JSON"`
- Prompts: default prompts from `prompts/default.yaml`
- Provider URLs:
  - vLLM: `"http://localhost:8000"`
  - OpenAI: `"https://api.openai.com/v1"`
  - Anthropic: `"https://api.anthropic.com/v1"`
Load configuration from a YAML file:

```python
config = Config.from_file("config.yaml")
```

Example `config.yaml`:
```yaml
port: 8989
host: localhost
log_level: INFO
log_format: JSON
provider_urls:
  vllm: "https://vllm.example.com"
  openai: "https://api.openai.com/v1"
  anthropic: "https://api.anthropic.com/v1"
```

Environment variables are automatically loaded with these mappings:

- `CODEGATE_APP_PORT`: Server port
- `CODEGATE_APP_HOST`: Server host
- `CODEGATE_APP_LOG_LEVEL`: Logging level
- `CODEGATE_LOG_FORMAT`: Log format
- `CODEGATE_PROMPTS_FILE`: Path to prompts YAML file
- `CODEGATE_PROVIDER_VLLM_URL`: vLLM provider URL
- `CODEGATE_PROVIDER_OPENAI_URL`: OpenAI provider URL
- `CODEGATE_PROVIDER_ANTHROPIC_URL`: Anthropic provider URL
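A sketch of how such a mapping could be applied: collect overrides only for variables that are actually set, leaving unset options to lower-priority sources. The `ENV_MAPPING` dict and `config_from_env` helper here are illustrative, not Codegate's internal names:

```python
import os

# Hypothetical mapping from environment variable names to config fields
ENV_MAPPING = {
    "CODEGATE_APP_PORT": "port",
    "CODEGATE_APP_HOST": "host",
    "CODEGATE_APP_LOG_LEVEL": "log_level",
    "CODEGATE_LOG_FORMAT": "log_format",
    "CODEGATE_PROMPTS_FILE": "prompts_file",
}

def config_from_env(environ=os.environ) -> dict:
    """Return overrides for every mapped variable present in the environment."""
    overrides = {}
    for env_name, field in ENV_MAPPING.items():
        if env_name in environ:
            overrides[field] = environ[env_name]
    return overrides
```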
Load configuration from environment variables:

```python
config = Config.from_env()
```

Provider URLs can be configured in several ways:
- In Configuration File:

  ```yaml
  provider_urls:
    vllm: "https://vllm.example.com"  # /v1 path is added automatically
    openai: "https://api.openai.com/v1"
    anthropic: "https://api.anthropic.com/v1"
  ```

- Via Environment Variables:

  ```shell
  export CODEGATE_PROVIDER_VLLM_URL=https://vllm.example.com
  export CODEGATE_PROVIDER_OPENAI_URL=https://api.openai.com/v1
  export CODEGATE_PROVIDER_ANTHROPIC_URL=https://api.anthropic.com/v1
  ```

- Via CLI Flags:

  ```shell
  codegate serve --vllm-url https://vllm.example.com
  ```
Note: For the vLLM provider, the /v1 path is automatically appended to the base URL if not present.
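The URL normalization described in the note could be implemented along these lines (the function name is hypothetical; only the appending behavior comes from the source):

```python
def normalize_vllm_url(base_url: str) -> str:
    """Append /v1 to a vLLM base URL unless it is already present."""
    trimmed = base_url.rstrip("/")
    if trimmed.endswith("/v1"):
        return trimmed
    return trimmed + "/v1"
```

This keeps URLs that already include `/v1` unchanged, so both forms shown in the examples above resolve to the same endpoint.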
Available log levels (case-insensitive):

- `ERROR`
- `WARNING`
- `INFO`
- `DEBUG`
Available log formats (case-insensitive):

- `JSON`
- `TEXT`
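Case-insensitive validation of these values might look like the following sketch (function names hypothetical; the accepted value sets are those listed above):

```python
VALID_LOG_LEVELS = {"ERROR", "WARNING", "INFO", "DEBUG"}
VALID_LOG_FORMATS = {"JSON", "TEXT"}

def validate_log_level(value: str) -> str:
    """Normalize to upper case and reject unknown levels."""
    level = value.upper()
    if level not in VALID_LOG_LEVELS:
        raise ValueError(f"Invalid log level: {value!r}")
    return level

def validate_log_format(value: str) -> str:
    """Normalize to upper case and reject unknown formats."""
    fmt = value.upper()
    if fmt not in VALID_LOG_FORMATS:
        raise ValueError(f"Invalid log format: {value!r}")
    return fmt
```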
Prompts can be configured in several ways:
- Default Prompts:

  - Located in `prompts/default.yaml`
  - Loaded automatically if no other prompts are specified

- In Configuration File:

  ```yaml
  # Option 1: Direct prompts definition
  prompts:
    my_prompt: "Custom prompt text"
    another_prompt: "Another prompt text"

  # Option 2: Reference to a prompts file
  prompts: "path/to/prompts.yaml"
  ```

- Via Environment Variable:

  ```shell
  export CODEGATE_PROMPTS_FILE=path/to/prompts.yaml
  ```

- Via CLI Flag:

  ```shell
  codegate serve --prompts path/to/prompts.yaml
  ```
Prompts files should be in YAML format with string values:
```yaml
prompt_name: "Prompt text content"
another_prompt: "More prompt text"
```

Access prompts in code:

```python
config = Config.load()
prompt = config.prompts.prompt_name
```

The configuration system uses a custom `ConfigurationError` exception for handling configuration-related errors, such as:
- Invalid port numbers (must be between 1 and 65535)
- Invalid log levels
- Invalid log formats
- YAML parsing errors
- File reading errors
- Invalid prompt values (must be strings)
- Missing or invalid prompts files
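A sketch of how `ConfigurationError` might be raised and handled, using the port constraint listed above as an example (the exception name comes from the source; the validator and its use are illustrative):

```python
class ConfigurationError(Exception):
    """Raised for invalid or unreadable configuration values."""

def validate_port(port: int) -> int:
    """Ports must fall in the valid TCP range 1-65535."""
    if not 1 <= port <= 65535:
        raise ConfigurationError(f"Port must be between 1 and 65535, got {port}")
    return port

# Callers can catch the single exception type for any configuration problem:
try:
    validate_port(70000)
except ConfigurationError as exc:
    print(f"Configuration error: {exc}")
```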