A tool that uses AI to automatically generate high-quality git commit messages based on your staged changes. Supports multiple LLM providers including Anthropic's Claude, OpenAI's GPT models, and Google's Gemini.
- 🤖 Uses AI to analyze code changes and generate meaningful commit messages
- 🧠 Supports multiple LLM providers: Anthropic Claude, OpenAI GPT, and Google Gemini
- 📝 Follows best practices for commit messages (conventional commits, imperative mood)
- 🚀 Single binary distribution with no dependencies
- 🔐 Securely stores API keys in your system's credential manager
- ✏️ Interactive mode allows you to edit messages before committing
- 🔄 Integrates directly into your git workflow
- 💻 Cross-platform support for macOS, Windows, and Linux
- 📊 Configurable context levels for accurate commit messages
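For example, given a staged change that adds input validation, a generated message in conventional-commit style might look like the following (hypothetical output — actual messages vary by provider, model, and diff):

```text
fix(auth): validate email format before creating user

Add an explicit format check in the signup handler so malformed
addresses are rejected with a 400 instead of failing later during
account activation.
```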
```bash
# Clone the repository
git clone https://github.com/nycjay/ai-commit-msg.git
cd ai-commit-msg
```

```bash
# Basic build (builds for all platforms; tests run by default)
./build.sh

# Build with a specific version
./build.sh --version 1.2.3

# Build only for the current platform
./build.sh --single-platform

# Run tests only, without building
./build.sh --test-only

# Build and create a symlink in /usr/local/bin (macOS/Linux only)
./build.sh --symlink
```

The build.sh script provides several options to customize the build process:
- `--version VERSION`: Specify a custom version number
  - If not provided, reads the version from the `VERSION` file
  - Falls back to `0.1.0` if no version is found
- `--test`: Run unit tests explicitly (tests run by default)
  - Tests run automatically with every build unless `--skip-tests` is specified
  - Checks compilation first
  - Stops the build if any tests fail
  - Provides colorful test result output with coverage information
- `--test-only`: Only run tests, without building
  - Useful for quick verification of code changes
- `--skip-tests`: Skip running tests (use when you want a faster build)
  - By default, the script runs all unit tests before building
  - This flag bypasses the test phase completely
- `--single-platform`: Build only for the current platform
  - By default, the script builds for multiple platforms (macOS Intel, macOS Apple Silicon, and Windows)
- `--symlink`: Create a symlink in `/usr/local/bin` (macOS/Linux only)
  - Makes the tool available globally in your PATH
  - Requires sudo privileges to create the symlink
  - Not applicable on Windows
- `--help`: Display help information about build script options
The project uses a VERSION file to track the current version of the tool:
- The version is automatically read from the `VERSION` file during build
- To update the version, edit the `VERSION` file

Example version update:

```bash
echo "1.0.0" > VERSION   # Update to version 1.0.0
./build.sh               # Build with the new version for all platforms
```
You can also override the version temporarily during build:
```bash
# Build with a specific version without modifying the VERSION file
./build.sh --version 1.2.3
```

When releasing a new version:

- Update the `VERSION` file
- Create a git tag matching the version
- Build for all platforms: `./build.sh`
- Distribute the binaries from the `bin/` directory:
  - `bin/ai-commit-msg-darwin-amd64` (macOS Intel)
  - `bin/ai-commit-msg-darwin-arm64` (macOS Apple Silicon)
  - `bin/ai-commit-msg-windows-amd64.exe` (Windows)
  - `bin/ai-commit-msg-linux-amd64` (Linux, if built on Linux)
You'll need an API key from one of the supported providers (Anthropic, OpenAI, or Gemini). The setup process is simple:
Simply run the tool without any parameters:

```bash
ai-commit-msg
```

The tool will prompt you for your API key (input is hidden) and offer to store it securely in your system's credential manager.
```bash
# For Anthropic Claude
./ai-commit-msg --store-key --key your-anthropic-key-here

# For OpenAI
./ai-commit-msg --provider openai --store-key --key your-openai-key-here

# For Gemini
./ai-commit-msg --provider gemini --store-key --key your-gemini-key-here
```

Alternatively, set the key via an environment variable:

```bash
# For Anthropic Claude
export ANTHROPIC_API_KEY="your-anthropic-key-here"

# For OpenAI
export OPENAI_API_KEY="your-openai-key-here"

# For Gemini
export GEMINI_API_KEY="your-gemini-key-here"
```

To generate a commit message:

- Stage your changes with `git add`
- Run the tool: `ai-commit-msg`
- Review the suggested commit message
- Choose to use it (y), edit it (e), or cancel (n)
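Put together, an interactive session might look like this (the transcript and suggested message are illustrative; the tool's actual prompt wording may differ):

```text
$ git add pkg/git/diff.go
$ ai-commit-msg

Suggested commit message:

  refactor(git): extract diff parsing into a helper

Use this message? (y)es / (e)dit / (n)o: y
```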
```
Options:
  --key "your-api-key"    Specify your API key for the selected provider
  --store-key             Store the provided API key in your system's credential manager
  -a, --auto              Automatically commit using the generated message without confirmation
  -v                      Enable verbose output (level 1)
  -vv                     Enable more verbose output (level 2)
  -vvv                    Enable debug output including full prompts (level 3)
  -c, --context N         Number of context lines to include in the diff (default: 3)
  -cc                     Include more context lines (10)
  -ccc                    Include maximum context (entire file)
  -p, --provider NAME     Specify the LLM provider to use (anthropic, openai, gemini)
  -m, --model MODEL       Specify the model to use (provider-specific)
  --list-providers        List available providers
  --list-models           List available models for the selected provider
  --system-prompt PATH    Specify a custom system prompt file path
  --user-prompt PATH      Specify a custom user prompt file path
  --remember              Remember command-line options in config for future use
  --help                  Display help information

Subcommands:
  init-prompts            Initialize custom prompt files in your config directory
  show-config             Display the current configuration
  list-providers          List all supported AI providers
  list-models             List available models (optionally for a specific provider)
```
Subcommand details:

- `list-providers`: Shows all supported AI providers for generating commit messages
  - Usage: `ai-commit-msg list-providers`
- `list-models`: Lists available models for all providers or for a specific provider
  - List models for all providers: `ai-commit-msg list-models`
  - List models for a specific provider: `ai-commit-msg list-models anthropic`
- `show-config`: Displays the current configuration settings, including:
  - Configuration directory
  - Current provider and model
  - Context lines
  - Verbosity level
- `init-prompts`: Initializes custom prompt files in your configuration directory
  - Usage: `ai-commit-msg init-prompts`
Examples:

```bash
ai-commit-msg list-providers          # List all available providers
ai-commit-msg list-models             # List models for all providers
ai-commit-msg list-models anthropic   # List models only for Anthropic
ai-commit-msg show-config             # Show current configuration
ai-commit-msg init-prompts            # Initialize custom prompt files
```
Control how much code context is included in your commit messages:
- Default (`ai-commit-msg`): 3 lines of context (git default)
- Medium context (`ai-commit-msg -cc`): 10 lines of context
- Enhanced context (`ai-commit-msg -ccc`): Full file content plus enhanced analysis of code structure, project context, and related files
- Custom context (`ai-commit-msg --context 8`): Specify an exact number of lines
The enhanced context mode (-ccc) provides comprehensive analysis by including file summaries, commit history, and project context for generating high-quality commit messages.
The tool supports multiple verbosity levels to provide more detailed information during operation:
- Silent (default): Only shows essential output and prompts
- Level 1 (`-v`): Basic operation logs, including file counts and timing
- Level 2 (`-vv`): More detailed logs with intermediate steps, file statistics, and branch information
- Level 3 (`-vvv`): Debug-level information, including the full prompts sent to the AI and detailed API responses
Use higher verbosity levels when:
- Troubleshooting issues with the tool
- Understanding exactly what data is being sent to the AI
- Diagnosing problems with API responses
- Seeing detailed information about your git repository and changes
The tool uses a flexible configuration system with the following precedence (highest to lowest):
- Command-line arguments
- Environment variables
- Configuration file
- Default values
Configuration is stored in the following locations, with paths prioritized based on platform conventions:
- All platforms: `$XDG_CONFIG_HOME/ai-commit-msg/config.toml` (if `XDG_CONFIG_HOME` is set)
- macOS:
  - `~/.config/ai-commit-msg/config.toml`
  - `~/Library/Application Support/ai-commit-msg/config.toml`
- Windows:
  - `%APPDATA%\ai-commit-msg\config.toml`
  - `~/.config/ai-commit-msg/config.toml`
  - `%USERPROFILE%\.config\ai-commit-msg\config.toml`
- Fallback: `~/.config/ai-commit-msg/config.toml`
The configuration path is dynamically selected for macOS, Windows, Linux, and other Unix-like systems, with XDG_CONFIG_HOME taking precedence across all platforms when set, ensuring broad compatibility and flexible configuration.
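As an illustration, a `config.toml` persisting a few of the settings above might look like this (the key names mirror the `AI_COMMIT_` environment variables; treat the exact names as assumptions rather than the tool's documented schema):

```toml
# ~/.config/ai-commit-msg/config.toml (illustrative)
provider = "anthropic"
model_name = "claude-3-opus-20240229"
context_lines = 10
verbosity = 1
```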
You can persist command-line options to the configuration file using the --remember flag:
```bash
# Remember to always use more context lines and the opus model
ai-commit-msg -cc --model claude-3-opus-20240229 --remember
```

Note that only settings that make sense across multiple commits are persisted:
- Persisted: Verbosity level, context lines, model name
- Not persisted: Jira issue ID, Jira description, auto-commit flag, store key flag (these are commit-specific or one-time operations)
- Hardcoded: Jira prefixes (GTN, GTBUG, TOOLS, TASK)
Environment variables use the prefix AI_COMMIT_:
```bash
export AI_COMMIT_VERBOSITY=2          # Set verbosity level
export AI_COMMIT_CONTEXT_LINES=5      # Set context lines
export AI_COMMIT_PROVIDER=openai      # Set provider (anthropic, openai, gemini)
export AI_COMMIT_MODEL_NAME=gpt-4     # Set model name
export AI_COMMIT_SYSTEM_PROMPT_PATH="/path/to/system_prompt.txt"  # Custom system prompt
export AI_COMMIT_USER_PROMPT_PATH="/path/to/user_prompt.txt"      # Custom user prompt
```

The configuration system is designed to be:
- Non-intrusive: Sensitive information (like API keys) is never stored in config files
- Persistent: Remember your preferences between runs
- Flexible: Multiple ways to configure based on your needs
Generate a commit message with an interactive prompt:

```bash
ai-commit-msg
```

Generate with more context lines:

```bash
ai-commit-msg -cc
```

Generate with a specific number of context lines:

```bash
ai-commit-msg --context 8
```

Generate and automatically commit:

```bash
ai-commit-msg -a
```

Generate with different levels of verbosity:

```bash
ai-commit-msg -v    # Basic verbose output
ai-commit-msg -vv   # More detailed output with intermediate steps
ai-commit-msg -vvv  # Debug-level output including full prompts
```

Store an API key in the credential manager:

```bash
ai-commit-msg --store-key --key sk_ant_your_key_here
```

Select a different AI provider:

```bash
ai-commit-msg --provider openai     # Use OpenAI (GPT) models
ai-commit-msg --provider gemini     # Use Google's Gemini models
ai-commit-msg --provider anthropic  # Use Anthropic's Claude models
```

Use different models:

```bash
ai-commit-msg --provider anthropic --model claude-3-opus-20240229  # Use Claude Opus
ai-commit-msg --provider openai --model gpt-4                      # Use GPT-4
ai-commit-msg --provider gemini --model gemini-1.5-pro             # Use Gemini Pro
```

List available providers and models:

```bash
ai-commit-msg --list-providers  # List all supported providers
ai-commit-msg --list-models     # List all available models for each provider
```

Initialize custom prompt files:

```bash
ai-commit-msg init-prompts
```

Use custom prompt files:

```bash
ai-commit-msg --system-prompt /path/to/system_prompt.txt --user-prompt /path/to/user_prompt.txt
```

Remember settings for future use:

```bash
ai-commit-msg -cc --model claude-3-opus-20240229 --remember
```

View the current configuration:

```bash
ai-commit-msg show-config
```

Display the version:

```bash
ai-commit-msg --version
```

You can create a git alias for easier access:

```bash
git config --global alias.claude '!ai-commit-msg'
```

Then use:

```bash
git claude
```

The tool supports multiple Large Language Model (LLM) providers, making it flexible to work with your preferred AI service:
- Anthropic Claude: High-quality language models with strong reasoning capabilities
- OpenAI: Support for GPT models, including GPT-4 and GPT-3.5 Turbo
- Gemini: Support for Google's Gemini models
You can select the provider with the `--provider` flag:

```bash
ai-commit-msg --provider anthropic  # Use Anthropic Claude
ai-commit-msg --provider openai     # Use OpenAI GPT
ai-commit-msg --provider gemini     # Use Google Gemini
```

Each provider has its own set of available models. You can list all providers and their models with:

```bash
ai-commit-msg list-providers         # List all supported providers
ai-commit-msg list-models            # List models for all providers
ai-commit-msg list-models anthropic  # List models for a specific provider
```

And select a specific model with:

```bash
ai-commit-msg --provider anthropic --model claude-3-opus-20240229
ai-commit-msg --provider openai --model gpt-4
ai-commit-msg --provider gemini --model gemini-1.5-pro
```

Each provider requires its own API key. You can use environment variables:

```bash
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
export GEMINI_API_KEY="your-gemini-key"
```

Or store them securely in your system's credential manager:

```bash
ai-commit-msg --provider anthropic --store-key --key your-anthropic-key
ai-commit-msg --provider openai --store-key --key your-openai-key
ai-commit-msg --provider gemini --store-key --key your-gemini-key
```

The tool stores each provider's API key separately, so you can switch between providers without re-entering keys.

When running the tool for the first time, you'll be guided through provider selection and API key setup:

```bash
ai-commit-msg
```

The interactive setup will:

- Ask you to select your preferred provider
- Provide guidance on where to obtain an API key
- Allow you to securely store the key in your system's credential manager
The tool uses carefully crafted prompts to generate commit messages. You can customize these prompts to change how the messages are generated:
```bash
# Initialize the prompt templates in your config directory
ai-commit-msg init-prompts
```

This creates three files in your config directory:

```
~/.config/ai-commit-msg/prompts/
├── system_prompt.txt          # Instructions for the LLM
├── user_prompt.txt            # Template for standard context
└── enhanced_user_prompt.txt   # Template for enhanced context
```
You can edit these files to customize:
- The commit message format and style
- The level of detail in explanations
- How Jira IDs are handled
- Additional instructions for the AI
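For instance, a customized `system_prompt.txt` enforcing a team convention might contain something like the following (entirely illustrative — the default template shipped with the tool differs):

```text
You are a commit message generator. Write messages in the
conventional-commit format: <type>(<scope>): <subject>.
Keep the subject under 72 characters and in the imperative mood.
If a Jira ID (e.g. GTN-123) is provided, prefix the subject with it.
Explain the "why" of the change in the body, not just the "what".
```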
Alternatively, you can specify custom prompt files from any location:

```bash
ai-commit-msg --system-prompt /path/to/system_prompt.txt --user-prompt /path/to/user_prompt.txt
```

You can easily switch between providers in your workflow:

```bash
# Default workflow with Anthropic
ai-commit-msg

# Switch to OpenAI for a specific commit
ai-commit-msg --provider openai

# Switch to Gemini with a specific model
ai-commit-msg --provider gemini --model gemini-1.5-pro

# Set OpenAI as your preferred provider
ai-commit-msg --provider openai --remember
```

The `--remember` flag will update your configuration to use the specified provider for future runs.
Once you've chosen your preferred provider, you can set it as your default for future commit messages:
```bash
ai-commit-msg --provider openai --remember
```

The build script builds binaries for all major platforms by default:
- macOS Intel (AMD64): `bin/ai-commit-msg-darwin-amd64`
- macOS Apple Silicon (ARM64): `bin/ai-commit-msg-darwin-arm64`
- Windows: `bin/ai-commit-msg-windows-amd64.exe`
- Linux (when building on Linux): `bin/ai-commit-msg-linux-amd64`
After building, the script automatically:
- Places platform-specific binaries in the `bin/` directory
- Copies the appropriate binary for your current platform to the project root
- Makes the binary executable
To build for only your current platform, use the --single-platform flag:
```bash
./build.sh --single-platform
```

To create a symlink in `/usr/local/bin` (macOS/Linux only) for global access:

```bash
./build.sh --symlink
```

The tool automatically detects your operating system and uses the appropriate credential manager:
- macOS: Uses the macOS Keychain via the `security` command-line tool
- Windows: Uses the Windows Credential Manager via the `wincred` package
- Linux: No native credential store support; environment variables work instead
On systems without a supported credential manager, the tool will automatically fall back to using environment variables and provide appropriate guidance.
The tool comes preconfigured with the following Jira issue type prefixes:
- GTN
- GTBUG
- TOOLS
- TASK
If your organization uses different Jira issue type prefixes, you can add them by modifying the `pkg/git/jira.go` file:

```go
// Example: Adding custom Jira prefixes
var JiraPrefixes = []string{
	"GTN",
	"GTBUG",
	"TOOLS",
	"TASK",
	"FEAT",   // Added custom prefix
	"PROJ-A", // Added custom prefix
	"BUG",    // Added custom prefix
}
```

After adding your custom prefixes, rebuild the tool with `./build.sh`.
The prompt system uses three template files:

- `system_prompt.txt` - Instructions for the LLM about commit message style and formatting
- `user_prompt.txt` - Template for git diff information (standard context)
- `enhanced_user_prompt.txt` - Template for enhanced context mode
You can customize the prompts to change how commit messages are generated:
1. Initialize the default templates:

   ```bash
   ai-commit-msg init-prompts
   ```

   This creates prompt files in your config directory (`~/.config/ai-commit-msg/prompts/`).

2. Specify custom prompt file paths:

   ```bash
   ai-commit-msg --system-prompt /path/to/system_prompt.txt --user-prompt /path/to/user_prompt.txt
   ```

3. Set custom prompt paths in the config file:

   ```toml
   system_prompt_path = "/path/to/system_prompt.txt"
   user_prompt_path = "/path/to/user_prompt.txt"
   ```

4. Use environment variables:

   ```bash
   export AI_COMMIT_SYSTEM_PROMPT_PATH="/path/to/system_prompt.txt"
   export AI_COMMIT_USER_PROMPT_PATH="/path/to/user_prompt.txt"
   ```
The tool follows this order of precedence when looking for prompt files:
- Custom paths specified via command-line flags
- Custom paths specified in the config file or environment variables
- Files in the user's config directory (`~/.config/ai-commit-msg/prompts/`)
- Default files in the tool's installation directory
Common reasons to customize the prompts:

- To change the commit message style (e.g., different format, more/less detail)
- To adapt messages for team-specific conventions
- To add specific requirements for your project
- To optimize the prompts for your specific workflow
The project automatically runs all unit tests as part of the build process:
```bash
# A regular build runs tests by default
./build.sh

# Skip tests during build
./build.sh --skip-tests

# Run only tests without building
./build.sh --test-only
```

The build script provides a comprehensive test report with coverage statistics for all packages.

You can also run tests manually using Go's testing tools:

```bash
# Run all tests
go test ./...

# Run tests with verbose output
go test -v ./...

# Run tests for a specific package
go test ./pkg/key
```

The project is organized into several packages:
- `cmd/ai-commit-msg`: Main application entry point
- `pkg/config`: Configuration management using Viper
- `pkg/key`: API key management with cross-platform credential store support
- `pkg/git`: Git operations and diff processing
- `pkg/ai`: LLM provider integration
The configuration system uses the Viper library to provide a flexible configuration experience:
- Multiple sources: Configuration can come from files, environment variables, and command-line flags
- Automatic binding: Environment variables are automatically mapped to configuration fields
- Persistence: Configuration can be saved to disk for future sessions
The key management system provides secure storage of API keys:
- Platform detection: Automatically detects the platform and uses the appropriate credential store
- macOS support: Uses the macOS Keychain
- Windows support: Uses the Windows Credential Manager
- Environment fallback: Falls back to environment variables when credential store is unavailable
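The platform-detection logic described above can be sketched in Go using `runtime.GOOS`. This is a minimal illustration of the selection behavior, not the tool's actual API — the function name and the backend strings are assumptions:

```go
package main

import (
	"fmt"
	"runtime"
)

// credentialBackend maps an operating system (as reported by runtime.GOOS)
// to the credential store the tool would use. Unsupported platforms fall
// back to environment variables, mirroring the behavior described above.
func credentialBackend(goos string) string {
	switch goos {
	case "darwin":
		return "macOS Keychain"
	case "windows":
		return "Windows Credential Manager"
	default:
		// No native credential store: fall back to environment variables.
		return "environment variables (fallback)"
	}
}

func main() {
	fmt.Printf("API keys on %s would use: %s\n",
		runtime.GOOS, credentialBackend(runtime.GOOS))
}
```

Keeping the mapping in a pure function like this makes the fallback behavior easy to unit-test without touching a real keychain.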
- Stage only related changes together for more focused commit messages
- Use the `-cc` or `-ccc` flags for more detailed commit messages
- For large changes, consider breaking them into smaller, logical commits