PoshLLM brings the power of Large Language Models (LLMs) directly to your PowerShell command line. Interact seamlessly with local LLM systems like Ollama from your PowerShell session using intuitive commands and convenient aliases.
- Direct LLM Integration - Connect to local LLM systems (Ollama) from PowerShell
- Interactive Prompting - Ask questions and get intelligent responses instantly
- Code Generation - Generate PowerShell code with LLM assistance
- Quick Aliases - Use `ai`, `llm`, or `ask` for fast interaction
- Syntax Highlighting - Beautiful PowerShell code highlighting in responses
- Code Execution - Option to execute generated code directly or copy to clipboard
- Flexible Configuration - Configure LLM system, model, URL, and context size
- Parameter Overrides - Override settings per command as needed
- PowerShell 7.0 or later
- An LLM backend: Ollama (local) or the ClaudeCode CLI
- Windows, macOS, or Linux
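If you're unsure which PowerShell you're running, the built-in `$PSVersionTable` automatic variable reports the version:

```powershell
# Check the running PowerShell version (7.0 or later is required)
if ($PSVersionTable.PSVersion.Major -lt 7) {
    Write-Warning "PoshLLM requires PowerShell 7.0+; found $($PSVersionTable.PSVersion)"
}
```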
Install from the PowerShell Gallery:

```powershell
Install-Module -Name PoshLLM -Scope CurrentUser
```

Or install from source:

```powershell
git clone https://github.com/DevPossible/PoshLLM.git
cd PoshLLM
Import-Module ./PoshLLM.psd1
```

Import the module and configure it:

```powershell
Import-Module PoshLLM

# Use defaults (Ollama on localhost:11434)
Set-PoshLLMConfiguration

# Or specify custom settings for Ollama
Set-PoshLLMConfiguration -LLMSystem "ollama" -Model "llama3:latest" -Location "http://localhost:11434"

# Or configure for ClaudeCode CLI
Set-PoshLLMConfiguration -LLMSystem "claudecode" -Model "claude-3.5-sonnet" -Location "claude" -ApiKey "your-api-key"
```

Then start asking:

```powershell
# Ask a question
ai "What is PowerShell?"

# Get code generation help
llm "Write a function to compress files in a directory"

# Use the ask alias for natural queries
ask "How do I list running processes sorted by memory usage?"
```

The module comes with sensible defaults:
- LLM System: ollama
- Location: http://localhost:11434 (for ollama) or "claude" (for claudecode)
- Context Size: 4096 (maximum 65536 bytes / 64KB)
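The bundled `Get-PoshLLMInfo` command displays module information and the current configuration, which is a quick way to verify these defaults (and any overrides you apply later):

```powershell
# Show the module's current settings (system, model, location, context size)
Get-PoshLLMInfo
```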
Configure once and use throughout your session:
```powershell
# Minimal configuration (uses defaults)
Set-PoshLLMConfiguration

# Full configuration for Ollama
Set-PoshLLMConfiguration -LLMSystem "ollama" `
    -Model "mistral:latest" `
    -Location "http://localhost:11434" `
    -ContextSize 8192

# Configuration for ClaudeCode
Set-PoshLLMConfiguration -LLMSystem "claudecode" `
    -Model "claude-3.5-sonnet" `
    -Location "claude" `
    -ApiKey "your-api-key"

# Using the backward-compatible alias
Configure-PoshLLM -Model "llama3:8b"
```

```powershell
# General questions
ai "Explain what .NET assemblies are"

# PowerShell-specific help
llm "How do I parse JSON in PowerShell?"

# Command suggestions
ask "What's the best way to handle errors in PowerShell scripts?"
```

Use `-IncludeContext` to help the LLM understand what went wrong:
```
PS C:\> git clone bob
fatal: repository 'bob' does not exist
PS C:\> ai "why did this command fail" -IncludeContext 2 -ResponseType Text

LLM Response:
The command `git clone bob` failed because the `git clone` command requires
a valid repository URL or path to clone from. The argument `bob` is not a
valid URL or existing repository path.

To fix this, provide a valid Git repository URL or a local path to a repository.
```

This example shows how PoshLLM can analyze your recent command history to provide context-aware explanations.
When PoshLLM detects code in the response, you'll be prompted with options:
```powershell
llm "Write a function to list all running services"

# You'll see highlighted code and options to:
# [C] Copy to clipboard (default) - Copy the code to your clipboard
# [E] Execute - Run the code immediately
# [F] Fix - Ask the LLM to review and fix any issues with the code
# [A] Alternate - Request a different solution from the LLM
# [R] Redirect - Ask the LLM to modify the code or answer questions about it
# [X] Exit - Just display without action
```

Fix Option: If the generated code has issues or you want the LLM to review it:

```powershell
# After seeing the code, select [F]
# The LLM will analyze the code and provide a corrected version
```

Alternate Option: Get different approaches to the same problem:

```powershell
# After seeing the code, select [A]
# The LLM will provide a different solution
# You can select [A] multiple times to see various approaches
# Each previous solution is tracked to avoid duplicates
```

Redirect Option: Customize or query the LLM about the generated code:

```powershell
# After seeing the code, select [R]
# Enter your request, such as:
# - "add error handling"
# - "add comments explaining each step"
# - "make it work with pipeline input"
# - "explain what this code does"
# The LLM will respond based on your request
```

```powershell
# Use a different configuration for one query
Invoke-LLM -Prompt "Explain recursion" -Config @{Model = "codellama:latest"}

# Override with a different LLM system temporarily
Invoke-LLM -Prompt "Write a Get function" -Config @{LLMSystem = "claudecode"; Model = "claude-3.5-sonnet"}

# Control response format
Invoke-LLM -Prompt "List 10 boy names" -ResponseType Data -DataFormat CSV
Invoke-LLM -Prompt "Get running processes" -ResponseType Script
Invoke-LLM -Prompt "What is PowerShell?" -ResponseType Text

# Include console context for error analysis
ai "why did this command fail?" -IncludeContext 5

# Get raw response for scripting
$response = ai "Create a function Get-LargeFiles" -Raw

# Preview the prompt without sending it
$prompt = ai "list files" -IncludeContext 5 -GetPrompt
```

| Command | Alias | Description |
|---|---|---|
| `Invoke-LLM` | `ai`, `llm`, `ask` | Send prompts to the LLM and process responses |
| `Set-PoshLLMConfiguration` | `Configure-PoshLLM` | Configure LLM connection settings |
| `Get-PoshLLMInfo` | - | Display module information and current configuration |
| `Show-SyntaxHighlightedCode` | - | Display PowerShell code with syntax highlighting |
```powershell
# Example: Generate and execute code dynamically
$code = ai "Create a function Get-LargeFiles that finds files over 100MB" -Raw
Invoke-Expression $code
Get-LargeFiles -Path "C:\Users"
```

```powershell
# Multi-line prompts
$prompt = @"
I need a PowerShell function that:
1. Accepts a directory path
2. Recursively searches for .log files
3. Filters files older than 30 days
4. Returns file objects with size and age
"@
llm $prompt
```

```powershell
# Add to your PowerShell profile for instant access
if (Get-Module -ListAvailable -Name PoshLLM) {
    Import-Module PoshLLM
    Set-PoshLLMConfiguration -Model "codellama:latest"
}
```

PoshLLM supports two LLM systems: Ollama (local) and ClaudeCode (Anthropic's Claude via CLI).
Ollama runs LLMs locally on your machine for complete privacy and offline use.
- Download from ollama.ai
- Install following platform-specific instructions
- Pull a model:

  ```
  ollama pull llama3:latest
  ollama pull codellama:latest
  ollama pull qwen3:8b
  ```

- Verify it's running:

  ```
  ollama list
  ```
```powershell
# Use defaults (Ollama on localhost:11434)
Set-PoshLLMConfiguration

# Or specify a custom model
Set-PoshLLMConfiguration -LLMSystem "ollama" -Model "llama3:latest"

# Or use a remote Ollama instance
Set-PoshLLMConfiguration -LLMSystem "ollama" -Model "mistral:latest" -Location "http://remote-server:11434"
```

Recommended models:

- codellama - Best for code generation
- llama3 - Great all-around model
- qwen3 - Fast and efficient
- mistral - Good balance of speed and capability
ClaudeCode CLI connects to Anthropic's Claude models via their API.
- Install via npm:

  ```
  npm install -g @anthropic-ai/claude-code
  ```

- Get your API key from the Anthropic Console
- Verify installation:

  ```
  claude --version
  ```

Configure PoshLLM to use ClaudeCode:

```powershell
# Configure with API key
Set-PoshLLMConfiguration -LLMSystem "claudecode" `
    -Model "claude-3.5-sonnet" `
    -Location "claude" `
    -ApiKey "your-api-key-here"
```

Available models:

- claude-3.5-sonnet - Most capable model, best for complex tasks
- claude-3-opus - Powerful model for demanding tasks
- claude-3-sonnet - Balanced performance and speed
- claude-3-haiku - Fastest model for simple tasks
- Requires active internet connection
- API usage is billed by Anthropic
- API key is stored securely in your user profile
- Data is processed in Anthropic's cloud
Q: Can I use LLM systems other than Ollama?

A: Yes! PoshLLM now supports both Ollama and ClaudeCode CLI. Support for additional LLM systems (OpenAI, Azure OpenAI, etc.) may be added in future versions.
Q: Is my data sent to the cloud?

A: It depends on which LLM system you use:
- Ollama: All processing happens locally on your machine. No data is sent to the cloud.
- ClaudeCode: Your prompts are sent to Anthropic's API for processing. Data is processed in Anthropic's cloud infrastructure.
Q: Is it safe to execute generated code in production?

A: While PoshLLM is great for development and learning, be cautious with executing generated code in production without review. That's a nice way of saying absolutely not :)
Q: Why does PoshLLM always prompt before executing code?

A: For security. PoshLLM always prompts before executing generated code to prevent accidental execution of potentially harmful commands.
Q: Can I switch models after configuring the module?

A: Yes! Use `Set-PoshLLMConfiguration -Model "newmodel"` or override per-command with the `-Model` parameter.
Q: Does PoshLLM work with Windows PowerShell 5.1?

A: No, PoshLLM requires PowerShell 7.0 or later for optimal performance and modern language features.
```powershell
# Verify Ollama is running
ollama list

# Check if the service is accessible
Test-NetConnection -ComputerName localhost -Port 11434

# Restart Ollama if needed
```

```powershell
# List available models
ollama list

# Pull the model you want to use
ollama pull llama3:latest

# Update PoshLLM configuration
Set-PoshLLMConfiguration -Model "llama3:latest"
```

If responses are slow:

- Use smaller models (e.g., `qwen3:8b` instead of `llama3:70b`)
- Reduce context size: `Set-PoshLLMConfiguration -ContextSize 2048`
- Ensure Ollama has sufficient system resources

```powershell
# Check if installed
Get-Module -ListAvailable PoshLLM

# Install if missing
Install-Module -Name PoshLLM -Scope CurrentUser -Force

# Import explicitly
Import-Module PoshLLM -Force
```

Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch (`git checkout -b feature/AmazingFeature`)
- Make your changes and test thoroughly
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
```powershell
# Clone the repository
git clone https://github.com/DevPossible/PoshLLM.git
cd PoshLLM

# Run tests
Invoke-Pester -Path ./Tests/

# Test the module locally
Import-Module ./PoshLLM.psd1 -Force
```

Please ensure:
- Code follows PowerShell best practices
- All tests pass
- New features include tests
- Documentation is updated
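As a sketch of what "new features include tests" can look like, here is a minimal Pester file (the test names are hypothetical, not the module's actual suite):

```powershell
# Hypothetical Pester tests for a PoshLLM contribution
Describe 'PoshLLM module' {
    It 'imports without errors' {
        { Import-Module ./PoshLLM.psd1 -Force -ErrorAction Stop } | Should -Not -Throw
    }
    It 'exports the Invoke-LLM command' {
        (Get-Command -Module PoshLLM).Name | Should -Contain 'Invoke-LLM'
    }
}
```

Run it with `Invoke-Pester -Path ./Tests/` as shown above.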
- Support for additional LLM providers (OpenAI, Azure OpenAI, Anthropic)
- Conversation history and context management
- Custom system prompts and personas
- Response caching for repeated queries
- Export/import conversation sessions
This project is licensed under the MIT License - see the LICENSE file for details.
- Built with ❤️ by DevPossible LLC
- Powered by Ollama
- Inspired by the PowerShell and AI communities
- Issues: GitHub Issues
- Documentation: Wiki (coming soon)
If you find PoshLLM useful, please:
- Star the repository
- Report bugs and issues
- Suggest new features
- Share with others
Made with PowerShell and AI | Copyright © 2025 DevPossible LLC