Codegate provides a command-line interface through `cli.py` with the following
structure:

```bash
codegate [OPTIONS] COMMAND [ARGS]...
```

Start the Codegate server:
```bash
codegate serve [OPTIONS]
```

- `--port INTEGER`: Port to listen on (default: `8989`)
  - Must be between 1 and 65535
  - Overrides configuration file and environment variables
- `--host TEXT`: Host to bind to (default: `localhost`)
  - Overrides configuration file and environment variables
- `--log-level [ERROR|WARNING|INFO|DEBUG]`: Set the log level (default: `INFO`)
  - Case-insensitive
  - Overrides configuration file and environment variables
- `--log-format [JSON|TEXT]`: Set the log format (default: `JSON`)
  - Case-insensitive
  - Overrides configuration file and environment variables
- `--config FILE`: Path to a YAML config file
  - Optional
  - Must be a valid YAML file
  - Configuration values can be overridden by environment variables and CLI options
- `--prompts FILE`: Path to a YAML prompts file
  - Optional
  - Must be a valid YAML file
  - Overrides default prompts and configuration file prompts
- `--vllm-url TEXT`: vLLM provider URL (default: `http://localhost:8000`)
  - Optional
  - Base URL for the vLLM provider (the `/v1` path is added automatically)
  - Overrides configuration file and environment variables
- `--openai-url TEXT`: OpenAI provider URL (default: `https://api.openai.com/v1`)
  - Optional
  - Base URL for the OpenAI provider
  - Overrides configuration file and environment variables
- `--anthropic-url TEXT`: Anthropic provider URL (default: `https://api.anthropic.com/v1`)
  - Optional
  - Base URL for the Anthropic provider
  - Overrides configuration file and environment variables
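To illustrate how the options above map onto a config file, here is a minimal sketch. The key names (`port`, `host`, `log_level`, `log_format`) are assumptions for illustration only; the actual schema is not shown in this document.

```yaml
# Hypothetical config file: key names are illustrative, not confirmed by this document.
port: 8989          # overridden by environment variables and by --port
host: localhost     # overridden by environment variables and by --host
log_level: INFO     # one of ERROR, WARNING, INFO, DEBUG
log_format: JSON    # JSON or TEXT
```

Whatever the exact schema, the precedence described above applies: values in the config file are overridden by environment variables, which are in turn overridden by the CLI flags.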
Display the loaded system prompts:

```bash
codegate show-prompts [OPTIONS]
```

- `--prompts FILE`: Path to a YAML prompts file
  - Optional
  - Must be a valid YAML file
  - If not provided, shows the default prompts from `prompts/default.yaml`
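A custom prompts file is a YAML document. The schema of `prompts/default.yaml` is not reproduced here, so the key and value below are purely illustrative:

```yaml
# Illustrative sketch only: the real keys in prompts/default.yaml are not shown in this document.
default_chat: |
  You are a helpful assistant.
```

Passing such a file via `--prompts` replaces the default prompts, as described above.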
The CLI provides user-friendly error messages for:
- Invalid port numbers
- Invalid log levels
- Invalid log formats
- Configuration file errors
- Prompts file errors
- Server startup failures
All errors are output to stderr with appropriate exit codes.
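Because errors go to stderr with a non-zero exit code, failures can be detected in scripts. A sketch of this pattern follows; the exact error message and exit code value are assumptions, not specified here:

```bash
# Sketch: the precise error text and exit code are assumptions.
codegate serve --port 99999 2> error.log
echo $?   # non-zero on failure; the message lands in error.log, not stdout
```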
Start server with default settings:

```bash
codegate serve
```

Start server on a specific port and host:

```bash
codegate serve --port 8989 --host localhost
```

Start server with custom logging:

```bash
codegate serve --log-level DEBUG --log-format TEXT
```

Start server with a configuration file:

```bash
codegate serve --config my-config.yaml
```

Start server with custom prompts:

```bash
codegate serve --prompts my-prompts.yaml
```

Start server with a custom vLLM endpoint:

```bash
codegate serve --vllm-url https://vllm.example.com
```

Show the default system prompts:

```bash
codegate show-prompts
```

Show prompts from a custom file:

```bash
codegate show-prompts --prompts my-prompts.yaml
```