Forge - Provider-Agnostic LLM CLI

A beautiful, Claude Code-inspired interactive CLI for language models with native MCP (Model Context Protocol) support. Works with both local models (Ollama) and cloud APIs (Anthropic).

Features

  • 🔌 Provider-Agnostic: Seamlessly switch between Ollama, Anthropic, and more
  • 🎨 Rich Terminal UI: Beautiful interface with syntax highlighting and markdown rendering
  • 💬 Interactive Chat: Conversational interface to any supported provider
  • 📁 Project Awareness: Automatically understands your project context
  • 🔧 MCP Tool Support: Native integration with MCP servers (coming soon)
  • 💾 Session Management: Save and load conversation sessions
  • ⚡ Streaming Responses: Real-time streaming with visual feedback
  • 🎯 Smart Context: Auto-includes relevant files based on your queries

Supported Providers

  • Ollama - Local models (llama3.2, mistral, etc.)
  • Anthropic - Claude models via API
  • Extensible - Easy to add new providers (see the sketch below)
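
Since the provider layer is the extension point, here is a minimal sketch of what a new backend could look like. The Provider base class and stream_chat method are illustrative assumptions, not Forge's actual API; the /api/chat route is Ollama's documented streaming chat endpoint.

# Hypothetical sketch of a provider interface; names are assumptions,
# not Forge's actual API.
from abc import ABC, abstractmethod
from typing import Iterator
import json
import urllib.request

class Provider(ABC):
    """Common interface each backend (Ollama, Anthropic, ...) would implement."""

    @abstractmethod
    def stream_chat(self, messages: list[dict], model: str) -> Iterator[str]:
        """Yield response text chunks for a list of chat messages."""

class OllamaProvider(Provider):
    def __init__(self, api_url: str = "http://localhost:11434"):
        self.api_url = api_url

    def stream_chat(self, messages: list[dict], model: str) -> Iterator[str]:
        # POST to Ollama's /api/chat endpoint with stream=True and yield
        # each JSON-lines chunk's message content (error handling omitted).
        payload = {"model": model, "messages": messages, "stream": True}
        req = urllib.request.Request(
            f"{self.api_url}/api/chat",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            for line in resp:
                yield json.loads(line).get("message", {}).get("content", "")

Adding a new backend would then come down to subclassing Provider and mapping a name like "anthropic" to it.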

Installation

# From the repository root
pip install -e .

Quick Start

Using Ollama (Local Models)

# 1. Make sure Ollama is installed and running
ollama serve

# 2. Pull a model
ollama pull llama3.2

# 3. Run Forge
forge

Using Anthropic (Cloud API)

# 1. Set your API key
export ANTHROPIC_API_KEY="sk-ant-..."

# 2. Run Forge (auto-detects Anthropic from env var)
forge

# Or explicitly specify
forge --provider anthropic --model claude-3-5-sonnet-20241022
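
The auto-detection described above presumably just inspects the environment. A hedged sketch of that selection logic follows; the pick_provider helper is hypothetical, not Forge's actual code.

# Hypothetical sketch of provider auto-detection; not Forge's actual code.
import os

def pick_provider(explicit: str | None = None) -> str:
    """Choose a provider: explicit flag wins, then env var, then local default."""
    if explicit:                              # e.g. forge --provider anthropic
        return explicit
    if os.environ.get("ANTHROPIC_API_KEY"):   # key present: use the cloud API
        return "anthropic"
    return "ollama"                           # otherwise default to local models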

Single Message Mode

forge "Explain this codebase"
forge --provider ollama --model llama3.2 "What is in this directory?"

Configuration

Forge looks for configuration in these locations (in order):

  1. .forge.yaml in current directory
  2. forge.yaml in current directory
  3. ~/.forge/config.yaml
  4. ~/.config/forge/config.yaml
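
A sketch of how that lookup order could be implemented; the find_config helper is illustrative, not Forge's actual code.

# Hypothetical sketch of the documented config lookup order.
from pathlib import Path

def find_config() -> Path | None:
    """Return the first existing config file, in the documented order."""
    candidates = [
        Path(".forge.yaml"),
        Path("forge.yaml"),
        Path.home() / ".forge" / "config.yaml",
        Path.home() / ".config" / "forge" / "config.yaml",
    ]
    for path in candidates:
        if path.is_file():
            return path
    return None  # no file found: fall back to built-in defaults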

Example configurations:

For Ollama:

backend:
  provider: "ollama"
  api_url: "http://localhost:11434"
  model: "llama3.2"
  max_tokens: 2048
  temperature: 0.7

features:
  auto_context: true      # Auto-include relevant files
  auto_save: true         # Auto-save sessions
  use_tools: true         # Enable MCP tools
  syntax_highlight: true  # Syntax highlighting
  persist_sessions: true  # Save sessions
  project_aware: true     # Project detection

For Anthropic:

backend:
  provider: "anthropic"
  api_key: "sk-ant-..."  # Or use ANTHROPIC_API_KEY env var
  model: "claude-3-5-sonnet-20241022"
  max_tokens: 2048
  temperature: 0.7

Commands

While in interactive mode, use these commands:

  • /help or /h - Show available commands
  • /tools or /t - List available MCP tools
  • /model or /m - Switch model
  • /context or /c - Manage conversation context
  • /session or /s - Show session info
  • /save - Save current session (see the persistence sketch after this list)
  • /load <name> - Load a previous session
  • /clear - Clear current session
  • /project or /p - Show project information
  • /exit or /q - Exit Forge
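
One plausible way /save and /load could work is a JSON file per named session. The ~/.forge/sessions location and the format below are assumptions for illustration, not Forge's documented behavior.

# Hypothetical sketch of named session persistence; the ~/.forge/sessions
# location and JSON format are assumptions, not documented behavior.
import json
from pathlib import Path

SESSIONS = Path.home() / ".forge" / "sessions"

def save_session(name: str, messages: list[dict]) -> Path:
    """Write the conversation history to ~/.forge/sessions/<name>.json."""
    SESSIONS.mkdir(parents=True, exist_ok=True)
    path = SESSIONS / f"{name}.json"
    path.write_text(json.dumps(messages, indent=2))
    return path

def load_session(name: str) -> list[dict]:
    """Read a previously saved conversation back (raises if it doesn't exist)."""
    return json.loads((SESSIONS / f"{name}.json").read_text())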

Usage Examples

Interactive Conversation

$ forge

╭─────────────────────────────────────────╮
│ Forge v0.1.0                            │
│ Local LLM + MCP Tools                   │
│ Model: local-llama-3.2                  │
│ Project: myproject (Python)             │
│ Backend: http://localhost:8000/v1       │
╰─────────────────────────────────────────╯

You: implement a function to parse JSON files