Transform natural language into system commands with AI 🚀
Current Version: 0.1.3
A truly AI-powered command-line assistant for Windows, Linux, and macOS that understands natural language and executes system commands intelligently. No more memorizing complex command syntax - just ask in plain English!
Unlike traditional CLI tools that rely on pattern matching, nlpcmd-ai uses an AI/LLM backend (OpenAI GPT-4, Anthropic Claude, or local Ollama) to understand complex, ambiguous requests and translate them into executable system operations, automatically adapting commands to your operating system.
- 📦 PyPI Package: https://pypi.org/project/nlpcmd-ai/
- 💻 GitHub Repository: https://github.com/Avikg/nlp_terminal_cmd
- 📚 Documentation: Full Documentation
- 🐛 Issues: Report Issues
- 🧠 True Natural Language Understanding - Uses AI models (OpenAI GPT-4, Anthropic Claude, or local Ollama)
- 🔒 Safe Execution - Smart confirmation for dangerous operations only
- 🔄 Context Awareness - Remembers conversation history for follow-up commands
- 🎯 Intent Detection - Automatically determines what you want to do
- 🔧 Extensible - Easy to add custom command handlers
- 🌍 Cross-platform - Works seamlessly on Windows, Linux, and macOS
- 💬 Interactive Mode - Chat-like interface for complex workflows
- 🆓 100% Free Option - Use with local Ollama (no API costs)
- ⚡ Fast & Efficient - Powered by optimized AI models
- 🐍 Pure Python - No PowerShell issues, works everywhere
# 1. Install
pip install nlpcmd-ai
# 2. Setup (choose one - Ollama is free!)
ollama pull llama3.2 # Free option
# OR get OpenAI API key from https://platform.openai.com/api-keys
# 3. Use it!
python -m nlpcmd_ai.cli "what is my ip"
- Python 3.8+ - Download Python
- pip (comes with Python)
pip install nlpcmd-ai
You have 3 options for the AI backend:
Run AI models locally on your computer - completely free, no API keys needed!
1. Install Ollama:
   - Windows/Mac: Download from https://ollama.ai
   - Linux:
     curl https://ollama.ai/install.sh | sh
2. Download an AI model:
   ollama pull llama3.2
3. Create a configuration file:
   Windows:
   echo NLP_PROVIDER=ollama > .env
   echo OLLAMA_MODEL=llama3.2 >> .env
   echo REQUIRE_CONFIRMATION=false >> .env
   Linux/macOS:
   cat > .env << EOF
   NLP_PROVIDER=ollama
   OLLAMA_MODEL=llama3.2
   REQUIRE_CONFIRMATION=false
   EOF
1. Get an API key: https://platform.openai.com/api-keys
2. Create a .env file:
   # Windows
   echo NLP_PROVIDER=openai > .env
   echo OPENAI_API_KEY=your-api-key-here >> .env
   # Linux/macOS
   cat > .env << EOF
   NLP_PROVIDER=openai
   OPENAI_API_KEY=your-api-key-here
   EOF
1. Get an API key: https://console.anthropic.com/
2. Create a .env file:
   NLP_PROVIDER=anthropic
   ANTHROPIC_API_KEY=your-api-key-here
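All three options use the same .env format: plain KEY=VALUE lines. For illustration, here is a minimal, dependency-free sketch of how such a file can be read (the real package may use python-dotenv or similar; `load_env` and its exact behavior here are assumptions, not nlpcmd-ai's actual loader):

```python
import os

def load_env(path=".env"):
    """Parse simple KEY=VALUE lines (illustrative sketch, not the real loader).

    Blank lines and '#' comments are ignored; parsed values are exported via
    os.environ.setdefault so real environment variables still take precedence.
    """
    parsed = {}
    if not os.path.exists(path):
        return parsed
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            parsed[key.strip()] = value.strip()
            os.environ.setdefault(key.strip(), value.strip())
    return parsed

config = load_env()
print(config.get("NLP_PROVIDER", "ollama"))  # fall back to a default provider
```

Because of `setdefault`, exporting NLP_PROVIDER in your shell would override the value in the file, which is the behavior most .env loaders follow.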
python -m nlpcmd_ai.cli "what is my ip"
If it shows your IP address, you're all set! 🎉
Just ask naturally - the AI will understand!
# System Information
python -m nlpcmd_ai.cli "what is my ip"
python -m nlpcmd_ai.cli "show mac address"
python -m nlpcmd_ai.cli "how much memory do I have"
python -m nlpcmd_ai.cli "show disk usage"
python -m nlpcmd_ai.cli "what's my CPU usage"
python -m nlpcmd_ai.cli "how long has my computer been running"
python -m nlpcmd_ai.cli "who am I"
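Queries like these don't require PowerShell: Python's standard library (plus psutil, which the project says it uses) can answer most of them directly. A stdlib-only sketch of the kind of lookups involved (`system_summary` is an illustrative name, not the package's actual API):

```python
import os
import platform
import shutil
import socket

def system_summary():
    """Collect a few of the facts the example queries above ask about."""
    # Disk usage of the filesystem root, in GB
    total, used, free = shutil.disk_usage(os.path.abspath(os.sep))
    return {
        "hostname": socket.gethostname(),   # machine name
        "os": platform.system(),            # Windows / Linux / Darwin
        "cpu_cores": os.cpu_count(),        # logical core count
        "disk_total_gb": round(total / 1024**3, 2),
        "disk_free_gb": round(free / 1024**3, 2),
    }

print(system_summary())
```

Live figures such as CPU usage percentage or per-interface MAC addresses are where psutil comes in (`psutil.cpu_percent()`, `psutil.net_if_addrs()`).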
# File Operations
python -m nlpcmd_ai.cli "list all python files"
python -m nlpcmd_ai.cli "find files larger than 10MB"
python -m nlpcmd_ai.cli "show directory structure"
python -m nlpcmd_ai.cli "show folders in C:"
python -m nlpcmd_ai.cli "find a.txt"
python -m nlpcmd_ai.cli "where is config.json"
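A request like "find files larger than 10MB" ultimately reduces to a directory walk with a size filter. A stdlib sketch of that reduction (`find_larger_than` is an illustrative helper, not part of nlpcmd-ai):

```python
import os

def find_larger_than(root, min_bytes):
    """Yield paths under root whose size exceeds min_bytes."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getsize(path) > min_bytes:
                    yield path
            except OSError:
                pass  # skip files that vanish or are unreadable mid-walk

# "find files larger than 10MB" in the current directory:
for path in find_larger_than(".", 10 * 1024 * 1024):
    print(path)
```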
# Network Commands
python -m nlpcmd_ai.cli "is port 8080 open"
python -m nlpcmd_ai.cli "ping google.com"
python -m nlpcmd_ai.cli "trace yahoo.com"
# Application Launching
python -m nlpcmd_ai.cli "open file explorer"
python -m nlpcmd_ai.cli "open browser"
# Get Help
python -m nlpcmd_ai.cli "help"
python -m nlpcmd_ai.cli "what can you do"
Have a conversation with your terminal:
python -m nlpcmd_ai.cli -i
Then chat naturally:
> what is my ip
> show disk usage
> list python files in current directory
> find files modified today
> help
> exit
# Auto-confirm (skip confirmation prompts)
python -m nlpcmd_ai.cli --yes "show disk usage"
# Dry run (see what would execute without running)
python -m nlpcmd_ai.cli --dry-run "delete all .log files"
# Help
python -m nlpcmd_ai.cli --help
Windows:
Create nlpai.bat in your PATH:
@echo off
python -m nlpcmd_ai.cli %*
Then use:
nlpai "what is my ip"
nlpai -i
Linux/macOS:
Add to ~/.bashrc or ~/.zshrc:
alias nlpai='python -m nlpcmd_ai.cli'
Then:
nlpai "what is my ip"
nlpai -i
- CPU usage, memory info, disk space - "show CPU usage", "how much RAM"
- System uptime, user info, hostname - "how long running", "who am I"
- Operating system details - Windows, Linux, macOS
- IP address lookup - "what is my ip", "show mac address"
- Port checking - "is port 8080 open"
- Network diagnostics - "ping google.com", "trace yahoo.com"
- Connectivity tests - Works without confirmation!
- List files - "list python files", "show folders in C:"
- Find files - "find a.txt", "where is config.json"
- Directory navigation - "show current folder", "folder structure"
- File search - "search for *.py files"
- List running processes - "show running processes"
- Find processes by name - "is python running"
- Monitor system resources - CPU, memory usage
- Open applications - "open file explorer", "open browser"
- Launch programs - Cross-platform support
- Get help - "help", "what can you do"
- Show capabilities - Lists all available commands
- Conversational queries - Friendly redirects to help
See SUPPORTED_COMMANDS.txt for complete list
# AI Provider (required)
NLP_PROVIDER=ollama # or "openai" or "anthropic"
# Provider-specific settings
OLLAMA_MODEL=llama3.2
OPENAI_API_KEY=your-key
ANTHROPIC_API_KEY=your-key
# Safety
REQUIRE_CONFIRMATION=false # Set to true for dangerous operations
# Logging
LOG_COMMANDS=true
LOG_FILE=~/.nlpcmd_ai/history.log
- ✅ Smart confirmation - Only dangerous operations require confirmation
- ✅ Safe diagnostics - Network commands (ping, tracert) run without prompts
- ✅ Dry-run mode to preview commands
- ✅ Command logging for an audit trail
- ✅ Path validation for file operations
- ✅ Protection against system directory access
- ✅ Network diagnostics (ping, tracert, nslookup)
- ✅ System info queries (CPU, memory, disk, uptime)
- ✅ File searches and listings
- ✅ Process listing
- ✅ Read-only operations
- ⚠️ Deleting files/directories
- ⚠️ Stopping processes
- ⚠️ Modifying system files
- ⚠️ Operations with sudo/admin privileges
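One way to implement "confirm only when dangerous" is to classify the generated command against a small pattern list before execution. The patterns and function below are an illustrative sketch, not the project's actual rules:

```python
import re

# Patterns that should trigger a confirmation prompt (illustrative list).
DANGEROUS_PATTERNS = [
    r"\brm\b", r"\bdel\b", r"\brmdir\b",  # deleting files/directories
    r"\bkill\b", r"\btaskkill\b",         # stopping processes
    r"\bsudo\b", r"\brunas\b",            # elevated privileges
    r"\bmkfs\b", r"\bformat\b",           # destructive disk operations
]

def needs_confirmation(command: str) -> bool:
    """Return True if the command matches any dangerous pattern."""
    return any(re.search(p, command, re.IGNORECASE) for p in DANGEROUS_PATTERNS)
```

Under this scheme, `ping google.com` executes immediately while `rm -rf build` or anything prefixed with `sudo` stops for confirmation, matching the behavior described above.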
$ python -m nlpcmd_ai.cli -i
> what's my CPU usage
CPU Information:
Usage: 23.5%
Cores: 8
Current Speed: 2400 MHz
> how much memory do I have
Memory Information:
Total: 16.00 GB
Available: 8.50 GB
Used: 7.50 GB
Usage: 46.9%
> show disk usage
Disk Usage for C:\:
Total: 476.94 GB
Used: 250.30 GB
Free: 226.64 GB
Usage: 52.5%

$ python -m nlpcmd_ai.cli "find all python files"
📁 Category: file_operation
⚡ Action: find_files
💻 Command: dir /s /b *.py
Found 15 files:
./nlpcmd_ai/engine.py
./nlpcmd_ai/handlers.py
./nlpcmd_ai/cli.py
...

$ python -m nlpcmd_ai.cli "what is my ip"
Local IP: 192.168.1.100
Public IP: 203.0.113.45
$ python -m nlpcmd_ai.cli "show mac address"
Network Interface MAC Addresses:
Ethernet: 00-1A-2B-3C-4D-5E
Wi-Fi: 00-1F-2E-3D-4C-5B
$ python -m nlpcmd_ai.cli "is port 8080 open"
❌ Port 8080 is CLOSED on localhost

$ python -m nlpcmd_ai.cli "where is config.json"
💻 Command: dir /s /b config.json
C:\Development\nlpcmd\config.json
C:\Users\user\project\config.json
Create custom handlers for your specific needs:
# custom_handler.py
from nlpcmd_ai.base_handler import BaseHandler, CommandResult

class MyCustomHandler(BaseHandler):
    def can_handle(self, category: str, action: str) -> bool:
        return category == "my_custom_category"

    def execute(self, command: str, parameters: dict, dry_run: bool = False) -> CommandResult:
        # Your custom logic here
        return CommandResult(success=True, output="Custom output")

See examples/custom_handlers.py for more examples.
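Dispatching over such handlers can be as simple as a first-match loop. The sketch below uses stand-in `CommandResult` and handler classes (the real ones live in `nlpcmd_ai.base_handler`), so it illustrates the shape of the mechanism rather than the package's actual internals:

```python
from dataclasses import dataclass

@dataclass
class CommandResult:  # stand-in for nlpcmd_ai's result type
    success: bool
    output: str

class EchoHandler:
    """Toy handler: claims the 'demo' category and echoes its input."""
    def can_handle(self, category, action):
        return category == "demo"

    def execute(self, command, parameters, dry_run=False):
        if dry_run:
            return CommandResult(True, f"[dry-run] would run: {command}")
        return CommandResult(True, f"ran {command} with {parameters}")

def dispatch(handlers, category, action, command, parameters, dry_run=False):
    """First handler whose can_handle() accepts the category/action wins."""
    for handler in handlers:
        if handler.can_handle(category, action):
            return handler.execute(command, parameters, dry_run=dry_run)
    return CommandResult(False, f"no handler for {category}/{action}")

result = dispatch([EchoHandler()], "demo", "run", "echo hi", {}, dry_run=True)
print(result.output)  # [dry-run] would run: echo hi
```

Registering a custom handler then just means adding an instance to the list the dispatcher iterates over.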
| Feature | nlpcmd-ai | Traditional CLI | Shell Scripts |
|---|---|---|---|
| Natural Language | ✅ Yes | ❌ No | ❌ No |
| Cross-Platform | ✅ Auto-adapts | ❌ No | ❌ No |
| Learning Curve | ✅ None | ❌ High | ❌ High |
| AI-Powered | ✅ Yes | ❌ No | ❌ No |
| Interactive | ✅ Yes | ❌ No | ❌ No |
| Extensible | ✅ Yes | ❌ No | ✅ Yes |
| No PowerShell Issues | ✅ Yes | ⚠️ Varies | ⚠️ Varies |
- ✅ Fixed MAC address retrieval using Python (psutil)
- ✅ Improved port checking with better parameter extraction
- ✅ Extended timeout for network commands (ping, tracert) - 120 seconds
- ✅ Removed confirmation prompts for safe diagnostic commands
- ✅ Better file search with dir /s /b pattern
- ✅ Help handler for conversational queries
- ✅ Application launching support (file explorer, browser)
- ✅ Enhanced AI prompt for better path handling
- ✅ All system info commands use Python (no PowerShell dependencies)
- ✅ Smart confirmation - only dangerous operations require confirmation
- ✅ Fixed action parameter passing to handlers
- ✅ Improved handling of the current directory vs. specific paths
Contributions are welcome! See CONTRIBUTING.md for guidelines.
# Clone repository
git clone https://github.com/Avikg/nlp_terminal_cmd.git
cd nlp_terminal_cmd
# Install in development mode
pip install -e .
# Install development dependencies
pip install -r requirements-dev.txt
# Run tests
pytest
This project is licensed under the MIT License - see the LICENSE file for details.
- Built with OpenAI GPT, Anthropic Claude, and Ollama
- Uses psutil for system information
- UI powered by Rich
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- PyPI: https://pypi.org/project/nlpcmd-ai/
If you find this project useful, please consider giving it a ⭐ on GitHub!
Made with ❤️ by Avik and Abhinandan
Try it now: pip install nlpcmd-ai