# Forge - Provider-Agnostic LLM CLI
A Claude Code-inspired, interactive CLI for language models with native MCP (Model Context Protocol) support. Works with both local models (Ollama) and cloud APIs (Anthropic).
## Features
- Provider-Agnostic: Seamlessly switch between Ollama, Anthropic, and more
- Rich Terminal UI: Clean interface with syntax highlighting and markdown rendering
- Interactive Chat: Natural conversation with any LLM provider
- Project Awareness: Automatically understands your project context
- MCP Tool Support: Native integration with MCP servers (coming soon)
- Session Management: Save and load conversation sessions
- Streaming Responses: Real-time streaming with visual feedback
- Smart Context: Auto-includes relevant files based on your queries
## Supported Providers
- Ollama - Local models (llama3.2, mistral, etc.)
- Anthropic - Claude models via API
- Extensible - Easy to add new providers
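That extensibility comes from a pluggable provider layer. As a rough, hypothetical sketch (the class and method names below are illustrative, not Forge's actual API), a new backend only needs to satisfy a small chat interface:

```python
from abc import ABC, abstractmethod
from typing import Iterator


class Provider(ABC):
    """Hypothetical provider interface; Forge's real API may differ."""

    @abstractmethod
    def stream_chat(self, messages: list[dict], model: str) -> Iterator[str]:
        """Yield chunks of the assistant's reply for a chat transcript."""


class EchoProvider(Provider):
    """Toy backend that echoes the last user message, useful for testing."""

    def stream_chat(self, messages: list[dict], model: str) -> Iterator[str]:
        yield messages[-1]["content"]
```

A backend like this would then be wired up under a provider name so it could be selected the same way as `ollama` or `anthropic`.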
## Installation
```bash
cd /home/spinoza/github/forge
pip install -e .
```
## Quick Start
### Using Ollama (Local Models)
```bash
# 1. Make sure Ollama is installed and running
ollama serve

# 2. Pull a model
ollama pull llama3.2

# 3. Run Forge
forge
```
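If Forge cannot reach a model, it helps to confirm the Ollama server is actually up. Ollama's REST API lists installed models at `GET /api/tags`; a stdlib-only check:

```python
import json
import urllib.request

# GET /api/tags returns the models installed in the local Ollama instance.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = json.load(resp)["models"]

print("Installed models:", [m["name"] for m in models])
```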
### Using Anthropic (Cloud API)
```bash
# 1. Set your API key
export ANTHROPIC_API_KEY="sk-ant-..."

# 2. Run Forge (auto-detects Anthropic from env var)
forge

# Or explicitly specify
forge --provider anthropic --model claude-3-5-sonnet-20241022
```
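The auto-detection amounts to inspecting the environment. Conceptually (this is an illustrative sketch, not Forge's actual detection code), it looks something like:

```python
import os

def detect_provider() -> str:
    """Illustrative only: prefer Anthropic when an API key is present."""
    if os.environ.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    return "ollama"  # fall back to the local backend

print(detect_provider())
```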
### Single Message Mode
```bash
forge "Explain this codebase"
forge --provider ollama --model llama3.2 "What is in this directory?"
```
## Configuration
Forge looks for configuration in these locations (in order):
1. `.forge.yaml` in the current directory
2. `forge.yaml` in the current directory
3. `~/.forge/config.yaml`
4. `~/.config/forge/config.yaml`
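Resolution is first-match-wins: the first file found in that order is used. A sketch of the equivalent lookup (the function name is illustrative, not Forge's internal code):

```python
from pathlib import Path

# Search order mirrors the list above; the first existing file wins.
SEARCH_PATHS = [
    Path(".forge.yaml"),
    Path("forge.yaml"),
    Path.home() / ".forge" / "config.yaml",
    Path.home() / ".config" / "forge" / "config.yaml",
]

def find_config() -> Path | None:
    """Illustrative lookup; Forge's internal implementation may differ."""
    for path in SEARCH_PATHS:
        if path.is_file():
            return path
    return None
```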
Example configurations:
For Ollama:
```yaml
backend:
  provider: "ollama"
  api_url: "http://localhost:11434"
  model: "llama3.2"
  max_tokens: 2048
  temperature: 0.7

features:
  auto_context: true      # Auto-include relevant files
  auto_save: true         # Auto-save sessions
  use_tools: true         # Enable MCP tools
  syntax_highlight: true  # Syntax highlighting
  persist_sessions: true  # Save sessions
  project_aware: true     # Project detection
```
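With an `ollama` backend, these settings correspond directly to Ollama's `POST /api/chat` endpoint, where `num_predict` is Ollama's name for the max-token limit. A stdlib-only sketch of the equivalent request (not Forge's internals):

```python
import json
import urllib.request

# Payload fields mirror the backend config above.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
    "options": {"num_predict": 2048, "temperature": 0.7},
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["message"]["content"])
```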
For Anthropic:
```yaml
backend:
  provider: "anthropic"
  api_key: "sk-ant-..."  # Or use ANTHROPIC_API_KEY env var
  model: "claude-3-5-sonnet-20241022"
  max_tokens: 2048
  temperature: 0.7
```
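These fields map one-to-one onto the official `anthropic` Python SDK (`pip install anthropic`), which is presumably what the Anthropic backend wraps:

```python
import anthropic

# The client picks up ANTHROPIC_API_KEY from the environment by default.
client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=2048,
    temperature=0.7,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.content[0].text)
```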
## Commands
While in interactive mode, use these commands:
- `/help` or `/h` - Show available commands
- `/tools` or `/t` - List available MCP tools
- `/model` or `/m` - Switch model
- `/context` or `/c` - Manage conversation context
- `/session` or `/s` - Show session info
- `/save` - Save current session
- `/load <name>` - Load a previous session
- `/clear` - Clear current session
- `/project` or `/p` - Show project information
- `/exit` or `/q` - Exit Forge
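Handling these is a simple prefix dispatch on the input line. A minimal, hypothetical sketch of the pattern (not Forge's actual implementation):

```python
import sys

# Hypothetical dispatcher: commands and aliases map to handler functions.
COMMANDS = {
    "/help": lambda arg: print("Commands: /help /tools /model ..."),
    "/load": lambda arg: print(f"Loading session {arg!r}"),
    "/exit": lambda arg: sys.exit(0),
}
ALIASES = {"/h": "/help", "/q": "/exit"}

def handle(line: str) -> bool:
    """Return True if the line was consumed as a slash command."""
    if not line.startswith("/"):
        return False  # plain chat input goes to the model
    name, _, arg = line.partition(" ")
    handler = COMMANDS.get(ALIASES.get(name, name))
    if handler:
        handler(arg.strip())
    else:
        print(f"Unknown command: {name}")
    return True
```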
## Usage Examples
### Interactive Conversation
```
$ forge
╭───────────────────────────────────╮
│ Forge v0.1.0                      │
│ Local LLM + MCP Tools             │
│ Model: local-llama-3.2            │
│ Project: myproject (Python)       │
│ Backend: http://localhost:8000/v1 │
╰───────────────────────────────────╯

You: implement a function to parse JSON files
```