How to Run Claude Code Locally for Free with the Open‑Source Free Claude Code Proxy

This guide introduces the open‑source Free Claude Code project and explains its FastAPI‑based proxy architecture, which routes Claude Code requests to backends such as NVIDIA NIM, OpenRouter, DeepSeek, LM Studio, llama.cpp, and Ollama. It also provides step‑by‑step instructions for installation, configuration, and deployment on a local machine.

Geek Labs

Free Claude Code is a popular open‑source project on GitHub that lets developers use the Claude Code programming assistant at no cost: it acts as a proxy between Claude Code and multiple AI backends.

Core Features

Free Claude Code is an Anthropic API proxy that forwards Claude Code traffic to free or self‑hosted models. Supported backends include:

NVIDIA NIM – free NVIDIA API

OpenRouter – multi‑model aggregation platform

DeepSeek – Chinese model provider with strong price‑to‑performance

LM Studio – local model service

llama.cpp – lightweight engine for running models locally

Ollama – local model execution tool

Key capabilities are:

Transparent proxy for Claude Code without modifying the original tool

Model‑level routing (Opus, Sonnet, Haiku)

Support for streaming output, tool calls, and reasoning mode

Remote programming via Discord or Telegram bots

Local Whisper speech‑to‑text integration
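Model‑level routing can be pictured as a small lookup table that maps the Claude model tier named in an incoming request (Opus, Sonnet, Haiku) to a backend model ID. The sketch below is illustrative only; the routing table and backend model names are assumptions, not the project's actual configuration.

```python
# Hypothetical routing table: Claude model tier -> backend model ID.
# These backend names are example assumptions, not the project's defaults.
ROUTES = {
    "opus": "deepseek/deepseek-reasoner",
    "sonnet": "deepseek/deepseek-chat",
    "haiku": "ollama/llama3.2",
}

def route_model(requested: str) -> str:
    """Pick a backend model based on the tier named in the request."""
    for tier, backend in ROUTES.items():
        if tier in requested.lower():
            return backend
    return ROUTES["sonnet"]  # fall back to the mid-tier model

print(route_model("claude-3-opus-20240229"))  # deepseek/deepseek-reasoner
```

Routing by tier rather than by exact model string keeps the table short and tolerant of Anthropic's dated model names.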

Technical Architecture

The service is a FastAPI application that sits between Claude Code and the chosen AI provider.

Request flow:

Claude Code sends a request to the local proxy (default port 8082).

The proxy converts the request to the target provider’s format.

The provider’s response is transformed back into Claude Code’s expected format.

Claude Code processes the response normally.
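The two conversion steps in the flow above can be sketched as a pair of helpers translating between the Anthropic Messages shape and an OpenAI‑style chat shape. This is a simplified illustration of the idea, not the project's actual code; streaming, tool calls, and system prompts are omitted.

```python
# Simplified sketch of the two format conversions in the request flow.
# Real providers differ in details; this shows only the general shape.

def anthropic_to_openai(body: dict) -> dict:
    """Convert an Anthropic Messages request into an OpenAI-style one."""
    return {
        "model": body["model"],
        "max_tokens": body.get("max_tokens", 1024),
        "messages": [
            {"role": m["role"], "content": m["content"]}
            for m in body["messages"]
        ],
    }

def openai_to_anthropic(resp: dict) -> dict:
    """Convert an OpenAI-style chat completion back into Anthropic shape."""
    choice = resp["choices"][0]
    finish = choice["finish_reason"]
    return {
        "type": "message",
        "role": "assistant",
        "content": [{"type": "text", "text": choice["message"]["content"]}],
        "stop_reason": "end_turn" if finish == "stop" else finish,
    }
```

Because both directions are pure dictionary transforms, the proxy can stay transparent: Claude Code never learns which provider actually served the request.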

Supported invocation methods are the Claude Code CLI, VS Code extension, JetBrains plugin, and Discord/Telegram bots.

The project is written in Python 3.14, uses uv as the package manager, and its code structure is clear, making it suitable for learning FastAPI and API‑proxy development.

Using the Proxy in China

DeepSeek is a Chinese service and can be accessed directly from within China. NVIDIA NIM may require a network proxy in some regions. Local models work fully offline.

For users in China, the recommended backend is DeepSeek, thanks to its stable API and low cost.

Quick Start

Install Claude Code: first install the official Claude Code tool, then install uv and Python 3.14.

Clone and configure:

git clone https://github.com/Alishahryar1/free-claude-code.git
cd free-claude-code
cp .env.example .env

Edit the .env file to set the DeepSeek backend:

DEEPSEEK_API_KEY="your-deepseek-api-key"
MODEL="deepseek/deepseek-chat"
ANTHROPIC_AUTH_TOKEN="freecc"

Start the proxy:

uv run uvicorn server:app --host 0.0.0.0 --port 8082

Run Claude Code with the proxy:

ANTHROPIC_AUTH_TOKEN="freecc" ANTHROPIC_BASE_URL="http://localhost:8082" claude
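Before launching Claude Code, you can sanity‑check the running proxy with a minimal Anthropic‑style request. The sketch below only builds the request with the standard library and prints it; the `/v1/messages` path and the header names follow the usual Anthropic API convention and are assumptions about how the proxy exposes them.

```python
import json
import urllib.request

# Minimal Anthropic-style Messages request aimed at the local proxy.
# The endpoint path and header names are assumptions based on the
# standard Anthropic API; the proxy must be running on port 8082.
payload = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Say hello in one word."}],
}
req = urllib.request.Request(
    "http://localhost:8082/v1/messages",
    data=json.dumps(payload).encode(),
    headers={
        "content-type": "application/json",
        "x-api-key": "freecc",  # the ANTHROPIC_AUTH_TOKEN from .env
        "anthropic-version": "2023-06-01",
    },
    method="POST",
)
print(req.full_url, req.get_method())
# With the proxy running, send it with: urllib.request.urlopen(req)
```

If the proxy answers with an Anthropic‑shaped `message` object, Claude Code should work against it unchanged.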

Applicable Scenarios

Suitable for developers who want a free Claude Code experience, need a locally run AI coding assistant, have data‑privacy requirements, or prefer Chinese models.

Considerations: some technical knowledge is required for configuration, certain backends need API keys, and local models demand sufficient hardware.

GitHub: https://github.com/Alishahryar1/free-claude-code
Stars: 17,801
Language: Python
License: MIT
Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Proxy, Python, open source, FastAPI, AI assistant, local deployment, Claude Code
Written by Geek Labs

Daily shares of interesting GitHub open-source projects: AI tools, automation gems, technical tutorials, open-source inspiration.